Charlie Hebdo: more security isn’t the solution

We don’t need new anti-terror laws – we need more open political debate.

Predictably, in the aftermath of the atrocities in Paris last week, many commentators have emphasised a presumed need for more security. The UK chancellor, George Osborne, was quick off the mark, asserting that tackling terrorism is now Britain’s ‘national priority’. Really? The number-one focus for some 64 million people is to handle the extreme actions of vexatious malcontents? What does that suggest about the malcontents’ presumed power? And what does it say about ours?

The problem revealed here, and in the talk of fighting a ‘war’ against extremists, is a complete loss of proportionality and perspective. We are no less safe today than we were a week ago. We have known for some time that random terrorist acts might strike anyone, anywhere and at any time. And yet now, in the wake of the Charlie Hebdo massacre, politicians are acting as if they had the solution to this threat all along.

UK prime minister David Cameron’s visit to Washington this week, in which he will discuss collaborating with President Obama against the threat of cyber attacks, reflects how lacking in ambition and thought this perceived solution is. In short, all Western leaders feel they can do to tackle the threat posed by terrorism is to intercept the perpetrators before they get a chance to commit their destructive acts. This leaves unanswered the central question of why these terrorist attacks are taking place.

So, billions are to be spent monitoring our movements and communications, while next to no energy is directed at trying to appreciate why it is that the nihilistic rhetoric of a tiny minority is resonating with some people in the West. But tighter security offers only a technical fix and affirms an elite narrative in which we are said to live in uncertain times and among capricious people in need of control. Thinking about what causes the new nihilism, by contrast, would open up a political debate about values and principles that could truly get to the heart of the matter. This, however, is a debate our political leaders would prefer to avoid.

What’s worse, ramping up security in response to terror attacks simply doesn’t work. It is an approach that confuses information with intelligence. What matters most is not how much evidence you can amass but how that evidence is interpreted. Drowning in data, security agents are often unable to see the wood for the trees. As the numbers of people now under surveillance run into the many thousands, the authorities ought to recognise that what they face is not a security problem, but rather a social problem that they have yet to address.

Even as world leaders marched in solidarity with Charlie Hebdo this weekend, the authorities continued to erode our liberties. Last November, the UK home secretary, Theresa May, tabled yet another counterterrorism bill – the seventh such piece of legislation to be introduced since 9/11. Among other stipulations, this bill will place new requirements on schools and universities to prevent those in their charge from becoming ‘radicalised’ – whatever that means – with severe sanctions for those who fail to do so. Most university officials, including security staff, view these new measures as both unworkable and unnecessary. Yet most universities will implement new procedures so as to be seen to be in accordance with the law. Ministers appear blind to the fact that they now act, and expect others to act, in bad faith.

So, authorities no longer say what they think, while not really believing what they say, either. Few, if any, government officials believe their anti-terror initiatives will work. These measures are purely for show. Politicians and officials have to be seen to be going through the motions.

Crises, such as the one we are facing today, often provoke a race to find meaning in society. Former British prime minister Tony Blair noted after the London riots that there is a danger at such times of crisis of reaching for the ‘wrong analysis, leading to the wrong diagnosis, leading to the wrong prescription’. It is imperative, therefore, that we oppose the predictable post-Charlie Hebdo calls for more security and instead kickstart a more rooted social analysis of the issues at hand.

Rather than accepting the supposed need for more protection, we ought to be asking why it is that our contemporary culture has so thoroughly failed to inspire and engage a generation of young people – to impart to them a sense of meaning, purpose and vision – that some of them are searching for meaning on jihadi internet forums or in the teachings of arcane religious belief systems.

Over the past week, many have repeated the mantra that the first duty of the state is to protect its citizens. That, too, is open to debate. The state itself is the creation of people who were prepared to risk everything, including their lives, to be free. Sadly, the US, in recent years, has seemed determined to make itself the land of the safe rather than the land of the free. It would be a very sad day if the French Republic was to go the same way.

What has most been missing in the so-called war on terror has been a vision for society beyond terror. That is the essence of real resilience: a projection of purpose and a sense of what we are in the absence of all adversities. If we were to achieve this, fewer people would look for purpose elsewhere, and the few that did decide to commit barbaric acts would be framed in the proper context: as mindless criminals.

First published on spiked, 12 January 2015

Dying for a purpose

The absence of meaning in modern war has made combat losses hard to bear.

‘If I should die, think only this of me: That there’s some corner of a foreign field that is forever England.’
‘The Soldier’ (1914), by Rupert Brooke

For almost 300 years, the British Army left its fallen where they fell, as had almost all armies before it. It was only after the Falklands campaign of 1982 that a significant number of bodies were first repatriated – at the behest initially of just one family – and even then, discreetly, for fear of demoralising the public and those still serving.

In his latest work, historian Steven Casey of the London School of Economics explores America’s treatment of war losses during the twentieth century. In his telling, officials and politicians will always be caught between a rock and a hard place when it comes to dead soldiers. To underreport losses, or to fail to attend to the return of the dead, is viewed as insensitive and as an attempt to conceal the harsh realities of war from the public. Yet, at the same time, high-profile send-offs of the dead can be seen as self-serving.

Fundamentally, Casey sets out to explode the mantra established by the US political scientist John Mueller, who, during the Vietnam War in 1973, asserted that ‘as casualties mount, support decreases’. Casey’s retort – in effect, ‘not always’ – is unconvincing, as he fails to draw out why support doesn’t always decrease as casualties mount.

Nevertheless, When Soldiers Fall, presented cogently and chronologically, provides plenty of useful material to allow readers to draw their own conclusions. Foremost among these is the growing inability of governments to project purpose outside of war, let alone through it. To ‘die for a lie’ in Iraq, for example, is considerably worse than to ‘die for a tie’ in Korea.

It is this changing ideological context that provides the backdrop to a story that opens with a relatively small and inexperienced (by European standards) US War Department in 1917 – the general staff consisted of just 20 officers – that evolved into the mightiest military on Earth just half a century later.

It is a period of undeniably rapid technological change – not least in reportage, from newsprint through radio and television, to today’s live-streaming internet coverage, which now escapes the control of a once powerful few. But technology alone does not determine outcomes, as the advocates of ‘technowar’ – from air power to smart bombs – were to discover.

Rather, Clausewitz’s dictum that war is ‘the continuation of politics by other means’, affected by domestic ‘friction’ and conceptual ‘fog’, might be a better place to start. Winning a war relies on the buy-in of the military (just as much as of the public), rather than just a nation’s technical superiority over the enemy. Spirit can matter more than kit.

Admittedly, mundane elements do affect the treatment of fatalities and casualties. Not least, as Casey examines, how to count the dead and the wounded. Losing a battle does not create a situation conducive to conducting elaborate or accurate body counts. Equally, defining serious injury is a moving target, complicated when casualties are treated (increasingly well and quickly) and returned to the front.

No doubt, there will always be challenges involved in the treatment of the dead. But these will not be resolved through better communications or PR strategies. It is the political dimension that will prevail.

From the war to end all wars and its reprise after Pearl Harbor, through Korea and Vietnam, to Afghanistan and Iraq (with the odd foray into Somalia and elsewhere excluded), Casey identifies the painstaking lengths to which the US military went to identify and count fatalities and casualties, as well as its deliberations over how and when to present these to the public.

Implicitly, this is a story about US rulers’ changing views of the public – from partners who could be trusted to share the same values and outlooks, to concern as to how best to keep the masses on side. But, as Frank Furedi has identified, experiencing problems as relating to trust psychologises the real driver – a crisis of authority.

Casey recognises that the domestic front was always hungry ‘not only for news… but also for analyses’, effectively conceding that making sense of conflict can matter just as much as numbers. Readers will have to look elsewhere to understand this gradual inability to imbue war with any meaning across the twentieth century. It is this that matters more than Mueller’s casualties.

It is why, as Casey correctly identifies, by the time of Vietnam, many in the military, just as much as in the media, had become uncomfortable with the language of death. GIs were never killed; ‘they were aced or greased, waxed or zapped’. But if the military itself used terms such as ‘wasted’, what did this say about its belief in the mission?

Along the way Casey does make some fine observations – not least regarding how in 1917, 1941 and 1950 the United States had gone to war with a segregated army. Unhappy that their role had been diminished and portrayed as one of support or lacking combat capability, many black voices demanded the ‘right to fight’, with tragic consequences in Vietnam. Casey cites Senator George McGovern who, in the aftermath of his 1972 presidential election defeat to Richard Nixon, ruefully observed: ‘When the corpses changed colour, American interest diminished’ – a sad indictment of the anti-war movement, which was more preoccupied with the shooting, by Ohio national guardsmen, of unarmed college students at Kent State University in 1970.

Another sorry truth is that, in terms of both fatalities and casualties, American losses during the Second World War amounted to more than twice those from all its other conflicts in the twentieth century put together. This prompted the US authorities to focus on promoting objectives rather than identifying losses – which, as Casey notes, there was hardly time to audit and report anyway.

In his wry essay ‘The Gulf War did not take place’, French philosopher Jean Baudrillard highlighted quite how ‘safe’ America’s wars subsequently became. Mistakenly, Casey sees the lower death toll as a reason why personal narratives now receive far greater prominence. But the media’s focus on emotion represents a retreat from political debate.

Most significantly, as I have argued elsewhere, values such as honour, duty and glory appear entirely anachronistic in an age when they have lost their use and meaning. When the necessity to fight – even for ideas – is dismissed, and attempts to impart a vision or direction discredited, then the game is up, and worse – destructive dissent is to be expected.

One consequence of conflicts’ loss of meaning has been the increased focus on technical processes such as auditing and communicating casualties, instead of clarifying a purpose. Yet societies possessed of a sense of mission – however misguided – have been able to countenance and withstand the most remarkable forms of barbarity, without first having to prioritise force protection.

Casey usefully reminds us that it was the Republican Party that opposed all of America’s major twentieth-century wars (two world wars, Korea and Vietnam). But his view that ‘in any war, faced with an information vacuum, the press and public tend to speculate’ confuses things. After all, the prevalence of rumours is also a measure of political confusion.

Before losing his life, Rupert Brooke continued the poem that opens this review: ‘There shall be in that rich earth a richer dust concealed; a dust whom England bore, shaped, made aware.’ Brooke’s soldier believed in something that gave his life, indeed, his death, a meaning. It is this loss of meaning that makes combat losses so hard to bear today – not just the absolute numbers.

When Soldiers Fall: How Americans Have Confronted Combat Losses from World War I to Afghanistan, by Steven Casey, is published by Oxford University Press.

First published on spiked, 8 August 2014

Putting the human soul on the slab

Human behaviour cannot be understood through brain scans.

If you step outside right now, you’ll find a lot of leaves have fallen on to the pavement. Some have blown away, but have left an imprint there, so you get these beautiful patterns left on the paving stones. If I were to show you a photograph of one of those prints, I think most people would say that they saw the print of a leaf. But that’s not what you would see; that would be your interpretation of what you see. What you would actually see is a patch of colour superimposed on another patch of colour.

You interpret what you see in this way because you have prior experiences of seeing leaves; you would have experienced the leaching of pigment from leaves that remains on the pavement, and you have experiences of things changing over time – that something that was there is no longer there. In fact, most human activity is the interpretation of data that we perceive through our senses rather than the mere representation of that data.

Such interpretations can be contested. Science tries to avoid that contestation by the repetition of experiments, having large numbers of results and looking at the significance of the numbers that are revealed.

One problem facing neuroscience is that it is plagued by very small signals, generated from the very small numbers of volunteers who are prepared to put their heads into a scanner, which are then enhanced through the remarkable imaging capacity of information technology. But those images are still contested.

Another problem is that neuroscience is in many ways a fledgling science and plagued by numerous disagreements, even among its proponents. For example, some would argue that a brain on its own does not explain much; it needs to be understood in context – with its environment, with other brains. Other neuroscientists may not see it that way. This contestation of interpretation and meaning is crucial to understanding what neuroscience can and cannot tell us.

Others have also pointed out that some neuroscientists notoriously try to smuggle the language of consciousness into how they describe the processes going on in the brain. They will say things like neurons ‘signal’ or ‘provide information’ or ‘respond to’. But neurons can’t do those things; that’s what we as human beings do. Neurons simply generate and transmit electrical impulses.

Neuroscience is also plagued by vague language. So neuroscientists may talk about one phenomenon being ‘associated with’ or ‘influenced by’ another. These are descriptions, not explanations.

It’s also the case that we perceive many things simultaneously. In an experimental setting, we somehow have to prioritise the experiences the experimenter is asking us to focus on, yet there may be activity going on at the same time in our brains that we are not conscious of. For example, we are constantly maintaining a state of homeostasis, such as keeping our balance or an optimal body temperature, which may cloud other things that are going on.

I don’t say these things to dismiss neuroscience, but because it is important to say that it is a contested field with some important barriers.

Neuroscience also suffers from presenting a deficit model of the brain. There is a lot of focus on what happens when a part of the brain is damaged and what this supposedly reveals about what would normally be there. But suggesting that normality is the opposite of damaged is a bit like trying to study democracy by looking only at dictatorships because it is assumed that one is the opposite of the other. That’s not true – and that indicates that there are limits to some of the experiments that are currently being reported.

In other words, the study of the brain is incredibly complicated. If I wanted to study a butterfly colony, for example, I could say that all things are ultimately physical, that butterflies are made of protons, neutrons and electrons. Therefore, all I need to do to understand that colony is to sum up where all these particles are, how they interact and how they move across time. In reality, such an effort would simply be far too time-consuming; there’s not enough time in the universe to determine what would happen. Hence, I don’t use physics to understand butterfly colonies.

My thoughts and actions may be reflected in my brain but they are part of a social field of relations, not just a neural or chemical set of relations. For example, a slip of the tongue may leave a neural or chemical signature that can be measured, but it is a really uninteresting way of describing what happens. My degree of embarrassment will have nothing to do with that neural signature; it will have to do with social context. Likewise, I don’t think neuroscience is about to explain anything sophisticated or important like the existence of slavery or sex discrimination.

What is clear, however, is that the language of neuroscience has been hijacked by some people in order to further pre-existing political agendas. I largely work in the field of security and I could show you a whole series of US Department of Defense white papers in this vein, with titles like ‘The Neurobiology of Political Violence’, ‘Neuroscience Insights on Radicalisation’, and so on. The language of neuroscience has been hijacked and adapted to many other fields.

However, to suggest that neuroscience is corrupting or influencing society is to put things entirely the wrong way round. Science, while shaping society, is also a product of the particular society and cultural mood from within which it emerges. The current cultural mood could be categorised as one dominated by a fear of change, a sense of limits, and a feeling of fragility which encourages a particularly dystopian outlook. That dystopian outlook, whether they know it or not, guides many scientists as to what they go off and investigate. As a consequence, we have apocalyptic interpretations of environmental science and deterministic presumptions presented by neuroscience.

Karl Marx, in his introduction to A Contribution to the Critique of Hegel’s Philosophy of Right in 1844 (an article most famous for the idea that ‘religion is the opium of the people’), made two points that are very appropriate here. Firstly, he writes: ‘Theory becomes a material force as soon as it has gripped the masses.’ So here we have Marx, a materialist, who understands that matter is not all that matters, and that ideas can have a material impact.

Secondly, he writes: ‘The immediate task of philosophy, once the holy form of human self-estrangement has been unmasked, is to unmask self-estrangement in its unholy forms.’ That seems very pertinent to our current situation. Having been through a period when we have revealed the limits of religious determinism, now we need to reveal the limits of neurodeterminism.

This is an edited version of a speech given at the Battle of Ideas festival on Sunday 20 October 2013 at the Barbican Centre in London. You can watch video of the whole debate, ‘Soul on the slab: is there no limit to what neuroscience can do?’, at WORLDbytes.

Terrorism: a homegrown fear

The enemy in the ‘war on terror’ was created by a lack of meaning or purpose in the West.

When I was coming up, it was a dangerous world, and you knew exactly who they were. It was us versus them, and it was clear who them was. Today, we are not so sure who they are, but we know they’re there.

With these words, spoken in 2000 before he was elected US president, George W Bush captured some of the uncertainty that had gripped the US establishment in the long aftermath of the Cold War.

Celebrated by some, most notably Francis Fukuyama, as heralding the ‘End of History’, the dismantling of the Cold War framework that had largely organised world affairs (and shaped identities) – both internationally and domestically – across much of the twentieth century proved unsettling for all those who understood themselves through it.

Such confusions continue to this day, and not simply in the US. After a recent terror-related incident that targeted the vicinity of the Legislative Buildings of British Columbia on Canada Day, the BC premier Christy Clark announced: ‘They want us to be governed by fear. They want us to look on each other with suspicion. They want us to be seized with anger. They want this because they hate the things that make us Canadian.’ But, as some analysts immediately noted, who exactly were the ‘they’ that she was pointing to?

In this case, ‘they’ would appear to have been a petty criminal and failed heavy-metal musician turned Muslim convert, and his methadone-taking, common-law wife, neither of whom particularly kept their dislikes discreet. And – just as significantly – what exactly are ‘the things that make us Canadian’ (or American, or British, or anything else for that matter)?

As the British writer James Heartfield notes in his critique of the postmodern outlook, The Death of the Subject Explained, constantly calling into question the object of our attention also points to confusion relating to the subject – ourselves. Yet, almost 10 years into the ‘war on terror’, US president Barack Obama would still write in his foreword to the 2011 US National Strategy for Counterterrorism: ‘To defeat al-Qaeda, we must define with precision and clarity who we are fighting’.

Not only have we failed to understand the enemy but, more importantly, we have failed to grasp the extent to which we have changed, too, and how this shapes those we confront. It is our lack of vision and direction for society that generates confusion over who the enemy is in the war on terror, and how to respond to them.

Interpreting meaning

The common adage that ‘generals always fight the last war’ could be augmented to include all manner of other professionals – including politicians, media commentators and even intelligence analysts. A mental model once ingrained is truly difficult to shake off.

The atrocities of 9/11 necessitated a response, but declaring a ‘war on terror’ was by no means the only possible one. Compare that with the response of the mother of Dutch filmmaker Theo van Gogh, who said of her son’s murder at the hands of a self-styled jihadist in 2004: ‘What is so regrettable … is that Theo has been murdered by such a loser, such an incoherent person. Murder or manslaughter is always a terrible thing, but to be killed by such a figure makes it especially hard.’ As the Holocaust survivor Viktor Frankl (1946) noted over half a century before, it is not suffering that destroys people – but suffering without meaning.

So, after 9/11, a meaning – political ideology – was presumed and projected. It allowed a disoriented administration the semblance of clarity and offered a cohering mission to society. In this, it was facilitated by the perpetrators themselves, whose chatter about global jihad was taken at face value.

In a similar way, the presumption that Iraq possessed weapons of mass destruction led – in 2003 – to its invasion. Any lack of evidence was either ignored or taken to confirm pre-existing views regarding how devious the regime was. Either way, policy needs and presumptions – not evidence – determined outcomes.

The same is true of much intelligence. This necessarily combines information with the interpretation of that information. Yet, time and again, when examining intelligence failures, the tendency is to blame just the information – either because insufficient information was available, or because there was too much to analyse. Alternatively, analysts worry about being provided with false, or misleading, information.

What is rarely questioned is the framework through which that information is interpreted. So, because in the past protests and violent outbreaks usually had a political or ideological purpose, today politicians, commentators and analysts look for political and ideological explanations – even when all the evidence points to the absence of these.

In the past, groups such as the Irish Republican Army (IRA) and the Palestine Liberation Organisation (PLO) fought national-liberation struggles. They used terror as a tactical means to achieve their strategic ends. But they knew above all that they needed to win the hearts and minds of their own communities.

In other words, they relied on mobilising a conscious and coherent collective. And they confronted an equally conscious and coherent state. Failures, on all sides, can be traced to their alignment – or not – with the people they claimed to speak and act on behalf of.

But al-Qaeda and the offshoots it supposedly inspires could not be more different. While some claim to speak on behalf of the ‘Ummah’, there is no evidence of any community ever having been consulted – let alone engaged. That is why even the families and friends of those involved express shock to hear of their activities.

Nor is there any coherent text outlining the purported mission or aims of these groups. Rather, much of this has been projected for them by analysts who seek to fill the vacuum of information left behind after the various acts of destruction with their own pet prejudices. A striking example: when asked on television to articulate their demands, one of the perpetrators of the 2008 Mumbai attacks was heard putting the phone down and asking a co-conspirator what their demands were.

Even if the perpetrators were mindless cannon-fodder, as some have suggested, and even if we know the real origins of these attacks, this still fails to explain why no one has come forward to claim responsibility for that incident, or for many others. Even when someone does claim responsibility – through so-called martyrdom videos and other media – there is precious little content other than a rambling rage.

Our failure is to attribute meaning – either political or ideological – to these actions. We thereby imbue vexatious acts of violence with greater import than they deserve. By doing so, we also attribute far too much authority and power to small numbers of individuals.

Implicitly, we also identify a gaping hole at the heart of our own societies – where ideology and politics should be. For what kind of society is it that can be so rattled by events that – in perspective – should be seen as minor, if unfortunate, historical footnotes?

Some analyses even effectively exonerate the individuals concerned by finding cause for them in the conditions of the developing world and our supposed insensitivity to these. Above all, our responses have allowed local and regional struggles, as well as isolated, irrational acts, to be presented as conflicts of global and epochal proportions.

Reflected caricatures

Osama bin Laden himself was fond of citing Western politicians, commentators, academics and diplomats in seeking to legitimise his ostensible cause. Sounding like any other contemporary critic of American policy, he droned on about a rag-bag of motives at different times. From primarily complaining about the relationship between the US and the Saudi regime, he switched to focusing more on Palestine after the events of 9/11 and then only later to Iraq, echoing the anti-war lobby’s claim that the war was simply a money-making venture for large corporations.

He lambasted the US for not signing up to the Kyoto treaty to control greenhouse gases, accused Washington of being controlled by a Jewish lobby, and argued that Western advertising exploited women. After the Madrid bombings of 2004, he even proposed that Western leaders should have paid more attention to surveys that revealed how few people supported going to war in Iraq.

In all of these, bin Laden and his acolytes revealed themselves as being entirely parasitical upon the caricatures and dystopian views that proliferated in, and emanated from, the West, as well as being obsessed with what was being said about them. One of the final images of bin Laden – sat watching himself on television – is quite apposite in that regard.

But what kind of Muslim leader is it who advises people to read the journalist Robert Fisk or the academic Noam Chomsky rather than, as one might have supposed, the Koran? And why did bin Laden choose to piggy-back his claims on Western opinion-poll data and the views of environmentalists in order to get his points across? (Although we should note that contemporary political leaders and religious figures in the West do much the same thing.)

Ayman al-Zawahiri – once right-hand man of bin Laden and the group’s supposed intellectual – displayed a similar tendency to draw ideas and inspiration from Western concerns when he noted, in relation to his growing, if evidently unrealistic, fascination with developing some kind of chemical or biological weapon: ‘Despite their extreme danger, we only became aware of them when the enemy drew our attention to them by repeatedly expressing concerns that they can be produced simply with easily available materials.’

In truth, bin Laden and al-Qaeda entirely lacked any substantial ideas of their own, let alone anything that amounts to an ideology. Bin Laden was the leader of nothing, who became – in an age enthralled by celebrity – the iconic terrorist of our times, unable to control his own fans never mind the course of history. Sadly, only in an age when image and style trump insight and substance at every turn could such aimless violence prompt such an all-consuming response.

Criticism of the West has long been around, but never before has it taken such a degraded form as in our post-political age. Even the presumed rise of religion in the recent period points to the evisceration of political engagement. And there is a world of difference between the cult-like religiosities of the present and traditional, religious organisations – though the former may better countenance rash acts of barbarism through their being less accountable to any wider institutions or mores.

Homegrown nihilists

Far from being atypical, recent self-styled jihadists intercepted in the domestic arena have exemplified the ineptness of the ever-expanding roll-call of marginal fantasists and wannabe terrorists who claim to be part of, or inspired by, al-Qaeda.

As spiked’s Brendan O’Neill has noted elsewhere, many modern terrorists, irrespective of their economic or educational backgrounds, display tactical, technical and organisational incompetence. And these form just the tip of the iceberg. This is not to dismiss the potential lethality of these plots and the devastating consequences they could have had upon those in their proximity had they been successful in their aims. Nor should we confuse them with the more serious threat posed to troops in Afghanistan, Iraq and elsewhere.

Yet, after each of these incidents, rather than point to the combination of vacuous bravado and concomitant failure, politicians, commentators and analysts have preferred to pursue purported links to al-Qaeda – connections they invariably manage to find, however tenuous.

But associating with groups such as Al-Mujahiroun or Jemaah Islamiyah, travelling to Pakistan to attend some kind of training camp, or surfing jihadist websites including the now notorious Inspire magazine – supposedly al-Qaeda’s web-based English language organ – does not explain anything.

Ideas do not transform people unless they resonate with their experience and existing interpretation of the world. Why do the ideas of fringe organisations appear to fall on such fertile soil? What is it about the West that seems to predispose some to identify with such nihilist groups?

In view of the sheer weight of alternative media to Inspire, how has our society failed to inspire individuals who are often young, bright and energetic, and provide them with rules, structures and meaning to live their lives by?

Ultimately, ideas have to emerge from somewhere. And extremism is the extreme expression of mainstream ideas. If our aim is to stop the extremists, we have to address the mainstream ideas that drive them.

In the most recent incidents – in Boston, London and Victoria, British Columbia – as well as many others, what we find are individuals consumed by a sense of self-righteousness. Islam – if it features at all – is often more an afterthought than a driver. It is their motif, not their motive.

But moral indignation is encouraged by contemporary society, which often presents a negative view of the present combined with a dystopian projection of the future. Disengaged from what passes for politics today, many young people come to develop an aggressive sense of entitlement, indulged by a society they seek simultaneously to distance themselves from.

The outcome covers the spectrum from asserting a new identity – young women wearing headscarves whose mothers never wore one – to inchoate rage, expressed either passively, in the so-called Occupy movement, or more acutely and violently, as in recent episodes of rioting. It is the unpredictable emergence of the latter that has led some analysts to express their surprise at how rapidly so-called self-radicalisation can occur. In fact, it is the failure of observers to identify the social currents beneath the surface that leads them to view matters this way.

Indeed, the parallels between ‘homegrown terrorists’ and other ‘lone wolves’ – such as Anders Breivik, who murdered 77 people in a bombing and shooting spree in Norway in 2011 – as well as the perpetrators of various mass high-school shootings (another relatively recent phenomenon), are more important than any purported political or cultural differences.

Domestic drivers

Space here precludes a detailed exposition of the various social, economic, political and cultural drivers of these trends that were largely catalysed into being only recently.

That modernity itself produces turmoil and disruption, while generating constant uncertainty, has been known for a long time. Marx and Engels noted as much in 1848 in The Manifesto of the Communist Party. But over the course of much of the twentieth century, the Cold War effectively kept the potential for change in check, by demanding adherence to particular worldviews.

The stand-off pitting the US and its allies against the Soviet Union and its satellite states across Eastern Europe and elsewhere divided the world externally, and was reproduced internally against the ‘enemy within’, understood then as emanating from trade unions or the political left.

But from about the mid-1980s, the erosion of the supposed twin threats of Soviet-style Marxism and state socialism – finally made evident through the unanticipated fall of the Berlin Wall in 1989 – opened the floodgates on the possibility for both public/political and private/personal transformation. This also encouraged the erosion of the distinction between these domains.

Without the forces that had held the political right together for so long, establishment elites were soon exposed as lacking any positive purpose or vision for society, and rapidly fell out among themselves. Replacement enemies were postulated, but none of the new litany of demons – from the Contras in Nicaragua and General Aideed in Somalia, through to Slobodan Milosevic in the former Yugoslavia and Saddam Hussein in Iraq – could match the military, moral and material cachet of the Red Army.

Little wonder then that even freedom-advocating Cold War warriors would oppose change when it came. For example, Margaret Thatcher, briefing then Soviet president Gorbachev in private meetings, told him that the lifting of the Iron Curtain and German reunification would ‘undermine the stability of the whole international situation and could endanger our security’, adding that – despite public pronouncements to the contrary – US president Ronald Reagan was of the same view.

New organising frameworks for society have struggled to fill the void left by the erosion of the old political and moral frameworks shaped by the interest-based politics of left and right. Ideology has – to some extent – made way for identity, but, as some have noted, the latter is a very fragile sense of identity, based on a ‘diminished’ sense of human agency.

That is why there is such resonance today for prevailing discourses that emphasise risk and uncertainty – despite these always having been part of the human condition. More problematically, this culture also elevates our sense of vulnerability over resilience, irrespective of official intent.

Even those charged with defeating terrorism buy into such negative narratives, pointing in their turn to the possibility (rather than probability) of future catastrophes (variously to be caused by limited resources, viruses, climate, population, the economy, technology, and other forces). They then imagine and act upon worst-case scenarios rather than focusing on the most likely.

In the past, such pessimistic projections would have been condemned as a loss of nerve that encouraged low morale; today, they are considered sensible precautions. They impact not just counterterrorism but upon all walks of life. For example, foreign governments encouraged their nationals to flee the vicinity of Tokyo in the aftermath of the Fukushima power-plant emergency triggered by the Great Tohoku earthquake – rather than, more humanely, to stay behind and help those they had been with.

A similarly shallow deterministic outlook explains why the rudimentary findings of neuroscience and simplistic business models have been co-opted to shed light on the causes and trajectories of terrorism. This is possible because they present a process without a subject in an age when our sense of autonomy and potential has been so curtailed. Accordingly, biological metaphors (ideas go viral, terrorists are spawned, etc) proliferate, as these also downplay our role and intentions (as well as – inadvertently – our accountability, too).

Nervous responses

With the West’s retreat from political ideology to process management, uncertainty has effectively been allowed to drive world affairs rather than emerging from them. A concomitant sense of insecurity has encouraged politicians and people everywhere to avoid expressing firm principles and values independently of simply managing perceived, exogenous threats.

But it is how we, as a society, respond to acts of destruction that determines their impact. Civilisation cannot be bombed out of existence by terrorists. It can, however, be corroded from within if all we do is focus on technical solutions to our problems rather than expanding our horizons through a strategic vision that could project a positive sense of mission for society.

In effect, we complete the acts perpetrated by domestic nihilists. When the UK prime minister David Cameron flew back from his overseas engagement to be seen to be addressing the brutal murder of an off-duty soldier on a London street, or when the city of Boston was put into lockdown by the authorities pursuing an injured teenager on the rampage, no amount of words extolling our resolve and resilience could alter the implicit message of societies disoriented by adversity.

Not only does this act as an encouragement to other loners and losers with an exaggerated sense of self-importance and grievance, it also flies in the face of the real solidarity and fortitude displayed by those most immediately affected. Such resolute responses at the time are then further undermined by the countless medical experts, media commentators and officials who all speculate about the long-lasting consequences such attacks are held to have on individuals and society.

In 2003, the then UK home secretary David Blunkett suggested in relation to one of these losers that the youth concerned posed ‘a very real threat to the life and liberty of our country’. What kind of country is it that can feel so threatened by the actions of such marginal figures?

Sadly, the focus on surveillance, protection, information and warnings that has emerged since 9/11 has the unintended consequence of promoting undue concern, mistrust and cynicism. It pushes people further apart from one another at a time when they need to be drawn together with a sense of common purpose. It also exemplifies the low view of the public and their likely responses evidently held by many in authority.

As opposed to the contemporary obsession with needing to identify unanticipated shocks to the system, it is the long-term drift at the top of society that will prove to be more destabilising in the long run. That is, the drift created by consistently seeking to protect society from without rather than revitalising it from within, and the gradual disengagement and distancing this fosters.

Dystopian projections

Less than 48 hours into the war on terror, British journalist Seumas Milne had an opinion piece published about the US: ‘They can’t see why they are hated’. Others soon followed, leading to expressions of outrage by establishment commentators. What they failed to notice was quite how normal such expressions of anti-Americanism had become.

A sense of contempt for supposedly soulless American consumerism is widespread – even among those working for the likes of Google and Citibank. And surely when Michael Moore’s Stupid White Men (2001) became a bestseller on both sides of the Atlantic – selling over 300,000 copies in the UK in its first year of publication alone – this should have alerted a few bright minds in the security agencies (and beyond) to a self-loathing that is significantly domestic in origin.

This has little to do with America itself, but rather reflects a broader dissatisfaction with the world that targets the US as its highest expression of success. That debate had been simmering for quite some time, particularly among the old political left. But the events of 9/11 catalysed – rather than triggered – this soul-searching, raising it across the board to a new level.

It is striking how common it is today to read book titles such as Alan Weisman’s The World Without Us, or hear respected academics describing humanity as a ‘plague’. These, and countless others like them, point to the low view we have come to have of ourselves in the contemporary world. They point to a significant clash within civilisation, rather than to that between civilisations as characterised by the American political scientist Samuel Huntington.

Unfortunately, such ideas serve to reinforce a cultural milieu within which low expectations and dystopian fantasies become the norm. But such a dismal view of ourselves, our role and our impact on the planet can become internalised by some. It frames a demoralised public discourse of apocalyptic failure and rejection that sustains those prepared to lose their lives – as well as those of others around them – in their misguided determination to leave their mark upon a world they feel encouraged to reject.

Conclusion

America found itself, at the turn of the last century, an indisputable – if somewhat reluctant – world power. It more formally attained that role propelled by events elsewhere – but also inspired by the narrative of ‘manifest destiny’ built on the Enlightenment optimism of Washington, Adams, Jefferson, Madison and others.

By the close of the century, America appeared more gripped by a sense of millenarian pessimism. Built not on size, but on the initiative of those confronting the unknown, its founding and guiding ideology was that of freedom – freedom from the past, and freedom of conscience, initiative, enterprise and of will.

The US, as immortalised by Francis Scott Key in his poem of 1814, was ‘the land of the free’ – not the ‘land of the secure’ – as it appears some today would have it. He understood that people in all places and at all times had been prepared to risk it all to achieve this.

We do not just live our lives – we lead them. And similar aspirations have inspired the struggles of others, however distorted these became in the years that ensued. To lose sight of this, to trade our freedom in order to be looked over by others and made to feel secure, is just one of the confusions that now grips America.

But the forgotten role of leaders today is to inspire people – not just to protect them. People who believe in their cause or project are far more effective agents of it than those who are coerced, managed or nudged.

What is most missing in the war on terror has been a vision for society beyond terror. That is the essence of real resilience, a sense of what we are for in the absence of all adversities; a projection of purpose. Otherwise, as is the case here, we effectively allow the challenges we confront to determine us rather than the other way round.

America still represents much of what is best in the world – as well as a little of what is worst. For all the challenges still confronting it, as well as the pretensions and delusions of others, the future remains for America to lose rather than for others to win. But over a decade into the war on terror, it is high time for America’s search for meaning to conclude through the re-invigoration of its founding values, as well as the identification of a new vision.

That way, many of the disillusioned individuals who look elsewhere for purpose and meaning would not need to, and the few that get through would be framed in the proper context – as mindless criminals.

First published on spiked, 11 September 2013

Precautionary Tales – Missing the Problem and its Cause

Extract: Two recently published volumes on the concept of precaution as it is variously understood and applied across the United States and in Europe make for a fascinating comparative analysis. Each also offers some undoubted and invaluable insights into the subject. Sadly, neither really addresses how precaution came of age or why.

Precautionary Tales: Missing the Problem and the Cause, European Journal of Risk Regulation, Vol.4, No.2, pp.297-299, June 2013

The spy who came in from the Cold War

The Red Army toilet-raiding realities of spying certainly exhilarated Steve Gibson, but the fall of the Berlin Wall brought doubt, too.

From the end of the Second World War through to the end of the Cold War, a little-known unit of British special forces conducted spying missions behind the Iron Curtain – that is, right from the heart of Soviet-occupied East Germany.

Called the British Commanders’-in-Chief Mission to the Group of Soviet Forces of Occupation in Germany, or BRIXMIS for short, it was part of an officially sanctioned exchange of observers between the Red Army and the British Army established by the victorious Allied powers and the USSR through the Robertson-Malinin agreement in 1946. Its ostensible purpose was to improve communication and relations between them.

In addition to BRIXMIS – and their French and American counterparts in the East – the Red Army also conducted similar operations through a unit in West Germany. But, diplomatic liaison and translation duties aside, the real purpose of these units soon became clear: to find out what each other was up to by heading out into those areas where they had been specifically told not to go.

My friend and former UK Defence Academy colleague Steve Gibson led many of these ‘tours’ just as the Cold War was coming to an end. Live and Let Spy is his gripping recollection of these episodes. Although originally published in 1997 as The Last Mission: Behind the Iron Curtain, it has now been republished and augmented some 15 years later with a significant additional chapter written with the hindsight gained during his subsequent academic career.

Much of the espionage involved gathering evidence about the weaponry available to the Red Army. Accordingly, it typically required lying in wait on a bridge over a railway line at three o’clock in the morning in the middle of a forest in winter. With temperatures dipping to around minus 30 degrees Celsius, the objective was to photograph all the kit that passed by on a train underneath. Alternatively, they might record the rate of fire from Russian guns from the safety of their locked vehicle in the sweltering 40-degree heat of summer.

Sent back to the Defence Intelligence Staff in Whitehall, this information allowed specialists to determine troop and equipment levels, as well as whether a new bolt on a gun or aerial on a tank might allow it to fire or communicate further than previously estimated – and if so, whether this would necessitate the complete re-evaluation of NATO’s Cold War battle plans.

Of course, the operations required meticulous planning to identify suitably concealed observation posts, as well as efficient access and escape routes. This planning was usually conducted during the day. For those so disposed, there are sufficient ‘tradecraft’ details here to sustain interest. For me, however, the real gem is the lesson identified early in the book – to be as conspicuous as possible by waving at everyone.

This waving tactic disoriented many into believing that the Mercedes G-Class (Geländewagen) passing by with three individuals in army fatigues in it was legitimate. And even if observers suspected something, the fact that nearby children would invariably wave back – raising the possibility that those inside were known to them – would add further confusion or delay. Those who did smell a rat usually did so too late.

It was not just a jolly jaunt. Over the years, a number of tour personnel lost their lives or suffered serious injury through being shot at or having Russian tanks ram their vehicles. East German ‘narks’ were also always on the look-out for anyone in the wrong place and would report them to the relevant authorities. Gibson notes, though, that many local ‘Easties’ were keen to help the agents.

Given the challenging circumstances, selection and training were intense and severe. It required individuals who could think quickly on their feet and not just expect to follow rules. It also meant having the ability to complete advanced courses in Russian and German, photography and navigation in next to no time, and to memorise the look and sound of countless pieces of Soviet military equipment, as well as remain calm – yet sharp – when tired or provoked.

Anyone who imagines that spying is glamorous, or somehow akin to being in a Bond movie, will be disabused by Gibson’s chapter on document-gathering from dumps (literally). It had been recognised for some time that, when they went on manoeuvres in East Germany, the Soviet forces were not supplied with any toilet paper. They would use whatever came to hand – a copy of Pravda, a letter from a loved one, or even their mission papers. And after they were done, it was then that Her Majesty’s specially trained and equipped Cold War warriors really came into their own…

The book is a tour de force of teeth-clenching tension that will keep most readers gripped from beginning to end. But while the first nine chapters retain the action-packed core of the original narrative, filled with the escapades of small teams of rather special individuals trying to find out what the Soviets were up to, the real substance – for those of a more political disposition – is a chapter titled ‘Reflections’.

As Richard Aldrich, a professor of political science at the University of Warwick, notes in the new foreword, Gibson is now clearly of the mind that ‘much of what [he] was led to believe [during the Cold War], and some of what he was told, was simply wrong!’

It is testimony to the author’s strength of character that – unlike others – he neither chose to dwell in the past nor fell prone to the ‘invention of illness’. This latter problem, he himself notes, affected many of his one-time colleagues once their personal and moral frameworks disintegrated with the end of the old, Cold War world order.

Gibson’s resolute clearsightedness is to be admired. Despite having been caught up in the exhilaration of it all as a young man, and despite devoting the prime of his life to the East-West conflict, he refuses to lie to himself. ‘The Cold War’, he notes, ‘was a giant historical cul-de-sac where all enlightened efforts at producing a good society were suspended’.

Aldrich astutely summarises a key argument of Live and Let Spy: ‘while Cold War warriors fought a tyrannical and ruthless version of Communism abroad, they remained ignorant of – and lost – an ideological battle at home’. He then adds accusingly: ‘Western politicians now offer a watered-down version of the interfering, intolerant, controlling and authoritarian government that they were initially set against rather than anything freer.’

In this, he takes his lead from Gibson, who rails against the erosion of ‘moral values, community spirit and sense of purpose’ that now pervades Western political elites. They are ‘pessimistic and misanthropic’, Gibson argues, while ‘suffering from an acute lack of confidence in their own projects’. This lack of authority, this social pessimism, they now effectively impose on others through a ‘moralising intervention into every aspect of private life’. But while the description of this new period will, no doubt, connect with many, Gibson – possibly by trying to cover too much, including passing references to Aristotle, Kant and Bentham for good measure – fails to provide a convincing explanation of why this all came about.

Taking his lead from the BBC documentary film producer Adam Curtis, Gibson identifies how the computer modelling of behaviour – and even more bizarrely, of intentions – came to dominate an intelligence world increasingly devoid of purpose or principle. But, as he himself notes, the intelligence community’s embrace of behaviour modelling is just as likely to be an expression of a broader ‘loss of faith in humans’ as the driver of social processes. Today, that loss of faith – and an obsession with risk management – comes to be expressed through the failure to put eyes and ears on the ground, as Gibson’s once were (a job for which he was awarded an MBE), and thereby a failure to verify theory through practice.

In addition, this final chapter makes three significant and unique contributions to improving our understanding and application of intelligence.

Firstly, he argues that the most useful role of intelligence today is to understand the context correctly, without which ‘purpose is equally misguided’. Secondly, and drawing on his most important academic contribution, Gibson notes that, ‘the use of single-source intelligence-reporting drawn from individuals selected principally for their willingness to share secrets…is not the best way to analyse contemporary challenges’, as the illusions about Iraqi ‘weapons of mass destruction’ ably demonstrated.

Whether these new challenges are, as he suggests, those so-called non-traditional security threats, such as climate change, energy supply and food provision, is open to debate. Possibly, it is the ‘dismissal of free will’ and ‘decline into mediocrity’ that he identifies elsewhere that are the real problems. And it is these problems that have turned the essentially technical issues of climate change or food provision into all-consuming sources of uncertainty and insecurity.

Finally, and significantly for one who has made the pursuit of freedom and autonomy central to his existence, Gibson notes the loss of any sense of fun in a politically correct world without an ‘enlightened purposeful ideology around which to cohere’. (This comes from a man who knows something about fun, having, in his youth, gatecrashed an international beauty pageant pretending to be Miss Austria’s personal bodyguard.)

Advocates of the ‘purposeless pragmatism’ and ‘bureaucratic regulation’ he now views as the real barrier to achieving ‘prosperity and progress for all’ would no doubt disapprove of Gibson’s youthful antics. It is unlikely, for instance, that they would appreciate the photographs of naked lovers taken from over one kilometre away that he and his colleagues once sent back to Ministry of Defence analysts to show that their equipment was working and that they were maintaining their skills. But, he notes, it is precisely intolerance towards the criticism – and in this case, the mockery – of widely held beliefs that precludes the effective determination of the truth.

Richard Aldrich concludes that ‘Gibson reflects that it takes the passage of time to recognise that one is misled by power’. For those who feel that after the fact is too late, and who still hope to shape history rather than merely be carried along by it, it is only through a constantly evolving analysis of present circumstances that such historical cul-de-sacs can be avoided.

This book – while not pretending to be any more than a personal memoir of some hitherto less disclosed aspects of the Cold War – serves to remind us of how far we have come since then. Looking back at how it all ended, Gibson concludes that ‘the somewhat hasty, undignified and testy disintegration of the Mission was intrinsically due to the absence of mission itself’.

This may well explain why – as was revealed from Kremlin minutes released some 20 years after the Berlin Wall came down – Western leaders were so keen at the time to remind the Soviet Union’s then-president, Mikhail Gorbachev, that they really did not want a re-unified Germany. For all their pro-freedom stance and rhetoric, they feared that a unified Germany would, in the words of Margaret Thatcher, ‘undermine the stability of the whole international situation and could endanger our security’.

Indeed, the Cold War may well have been the last time that Britain and the other Western powers could even pretend to have had a clear and positive sense of mission.

First published on spiked, 30 March 2012

The changing nature of riots in the contemporary metropolis from ideology to identity: lessons from the recent UK riots

Abstract: Whereas past episodes of rioting in UK cities confronted the state authorities with a conscious and collective political problem – whether through opposition to job losses or to institutional racism – in the post-political climate today we witness a shift towards individual action driven more by identity than by ideology. The one element that united the otherwise disaggregated rioters across the UK recently was more their taste in expensive sportswear (branded trainers) and electrical goods (plasma television screens) than anything else. Far from being a backlash against the police shooting of a petty, local black criminal in north London, or against the austerity measures introduced by the Liberal-Conservative government to combat the UK state deficit, some commentators suggest that what we now see is the product of a generation brought up on welfare for whom the old allegiances of work, family and community have lost their meaning and who, accordingly, are only able to assert their identity through the expression of their consumer tastes. This article examines what really drove the recent UK riots and explores the twin crises – of authority and of identity – that they have exposed.

The Changing Nature of Riots in the Contemporary Metropolis from Ideology to Identity: Lessons from the Recent UK Riots, Journal of Risk Research (Impact Factor 1.027 ISI 2015), Vol. 15, No. 4, pp.347-354, May 2012

How CSR became big business

Corporate social responsibility allows governments to avoid accountability and gives companies a sense of purpose.

Whenever society faces a crisis there tends to be a wave of moralism. So it is not surprising that, as the private-debt crisis has transformed into the public-debt calamity, there is now much discussion about the correct conduct of business and finance.

The last time such a significant conversation occurred on these matters was in the mid-1990s. Back then, economic turmoil and the dramatic downfalls of corporations and businessmen like BCCI, Polly Peck and Robert Maxwell – all tainted by accusations of fraud – led to the promotion of ‘corporate social responsibility’ (CSR). The ideas behind this concept were articulated in a landmark inquiry by the Royal Society for the Encouragement of Arts (RSA), Tomorrow’s Company: The Role of Business in a Changing World. It seems fitting, therefore, that the RSA’s current chief executive, Matthew Taylor, recently sought to articulate his vision for ‘enlightened enterprise’, laying out how ‘business can combine a strategy for competitive success with a commitment to social good’.

Looking back, though, it seems many of the corporate contributors to the original study might have been good at talking the CSR talk, but they were considerably less interested in, or capable of, walking the CSR walk.

Quite a few of the companies, including British Gas, British Airways and National Grid, were relatively recent creations of the privatisation boom under the previous Conservative administration. In their cases it is reasonable to suppose that their chief executives were keen to get behind the calls for change. Many others, such as electronics company Thorn EMI, transport and logistics firm Ocean Group, and the IT company FI Group, got caught up in the late-1990s wave of mergers and acquisitions, and so ended up being subsumed or disappearing entirely. No doubt, quite a few individuals got rich in the process.

Some of the original supporters of CSR – like The Strategic Partnership (London) Ltd – were more like tiny, shoestring-budget quangos, staffed by individuals whose intended policy clout far exceeded their business significance. At the other end of the spectrum, among those who pontificated about what makes a responsible company, were the leaders of Barings Venture Partners Ltd. Barings Bank collapsed in 1995 after one of its employees, Nick Leeson, lost £827 million due to speculative investing. So much for being responsible.

Tomorrow’s Company was a product of its time. Bemoaning the absence of non-financial measures for business success, it fed into the growing demand for procedural audits and targets that were to become one of the emblematic pledges of the New Labour government. And, in what was to become typical New Labour lingo, the inquiry demanded greater ‘dialogue’ and ‘inclusivity’.

The RSA inquiry complained of the ‘adversarial culture’ of the business world. This heralded later attacks on various supposed institutional cultures, including the ‘canteen culture’ of the police force, critiqued in the 1999 Stephen Lawrence inquiry, and the ‘club culture’ of the medical profession, lambasted in the 2001 Bristol Royal Infirmary inquiry. There have also been critiques of the army’s ‘barracks culture’ and, more recently, of the ‘macho culture’ of the International Monetary Fund (IMF). This was following the controversy involving the former IMF chief, Dominique Strauss-Kahn, who was claimed to have been protected by a French ‘culture of secrecy’.

The meaning of CSR today

Matthew Taylor, in his recent exposition of ‘enlightened enterprise’, also asks for a ‘shift in our national culture’. But whereas the 1995 RSA study called for change in response to the ‘increasingly complex, global and interdependent’ conditions within which businesses were allegedly operating, for Taylor the key problem to be corrected is human nature.

‘[H]uman beings are complex social animals’, he suggested in a recent speech, ‘influenced more by our nature and context and less by calculating, conscious decisions, than we intuitively believe’. Like other adherents of the new orthodoxies of behavioural economics and evolutionary psychology, Taylor talks of the need to create ‘more capable and responsible citizens’.

So what does all this have to do with business behaviour? One important clue was provided by Mark Goyder, programme director of the original RSA inquiry. He brought up ‘the notion of business as the most important agent of social change, in an age when governments are redefining and limiting their own sphere of influence’. Taylor, for his part, identified the idea of behaviour change as a key aspect of corporate responsibility and explained that the Lib-Con coalition has set up its own behaviour-change unit and that ‘the idea that we need to move from a government-centric to a people-centric model of social change is central to David Cameron’s vision of a Big Society’.

Against the backdrop of these two elements – the changing role of government and the view of ordinary people as little more than natural impulses on legs, as beings who need to be nudged into changing their behaviour – the new role of business becomes transparently clear. Businesses are to act on behalf of governments that can’t be trusted and for people who don’t know what’s good for them.

Taylor is quite explicit about this. ‘[T]he state’, he noted, ‘has many competing objectives and when it uses its power to nudge it opens itself up to charges of paternalism and social engineering’. Businesses, however, have the ability ‘to build on a relationship based on choice and consent, and in some cases a good degree of trust’. All these qualities are presumably no longer to be expected, or demanded, from government.

No doubt, many in the business community will jump at this invitation to take over the levers of power by acting as de facto school prefects on behalf of states that no longer want, or cannot be trusted, to rule. Many will also be excited by the ability to play an ever bigger role in the government’s nudge agenda and to take on the mantle of responsible agents for change.

From profits and growth to ‘performance with purpose’

Today, CSR is big business. And so the success of enterprise in this age is not to have the spirit that took people to the Moon, but to play a part in slimming waistlines and reducing carbon footprints. It’s simply a question of ‘selling the right stuff’, as PepsiCo’s CEO Indra Nooyi has put it. Nooyi has committed her company to ‘performance with purpose’, which includes providing healthy snacks. Likewise, the Mars Corporation’s new focus has little in common with the bold ambitions of the space-race era. It now wants to concentrate on selling products ‘as part of a balanced diet’ and on encouraging people to get ‘fit not fat’.

Taylor is aware of the possibility that not all of us would choose to pursue the ideals that he and his fellow nudge-enthusiasts espouse. To counteract this, business has to take the lead, ‘prompted by NGOs in a sense acting as quasi-regulators and intermediaries with consuming households’. The RSA has taken the lead in this respect, working with Shell and taxi drivers to make fuel-efficient behaviour more habitual.

Ultimately, Taylor comes across as gullible for buying into the idea that corporations want to put social responsibility first. He even cites ‘Flora’s cholesterol-cutting margarine’ as a service in protecting people’s health. This despite the fact that Flora’s claims are highly dubious, and the purported link between high cholesterol and heart disease is increasingly disputed and discredited. Perhaps Taylor will be promoting anti-ageing creams next?

A major error of CSR proponents is to assume that the key determinant of success for businesses and their employees is not making money, but being fulfilled in some other way. Taylor cited a Gallup survey which showed that ‘beyond obvious basic factors like health and a reasonable income, the key determinant of whether someone described themselves as thriving in their lives as a whole was whether they saw their employer – or manager – as a partner rather than a boss’. Here, he sounds rather like one of his predecessors at the RSA, Charles Handy, who, in answer to the question ‘What is a company for?’, said: ‘To talk of profits is no answer because I would say “of course, but profits to what further purpose?”’

CSR: the displacement of responsibility

But real profits, good health and reasonable incomes cannot so readily be assumed. They still have to be achieved, and cannot just be dismissed as ‘obvious’ in a desire to promote a new business agenda. In fact, the CSR agenda has helped businesses get away with ignoring the self-expressed needs of their employees. British Airways, for instance, was commended for its social and environmental reports while simultaneously undermining working conditions for its staff.

Indeed, the ideal CSR scheme focuses its supposed benefits elsewhere – typically it is directed at poor people ‘without a voice’ or, better still, at animals or the environment, which can’t talk back at all. That way, businesses can offer token sums and gestures to impoverished communities and satisfy eco-activists and their media groupies at the same time – all the while compelling staff to subsidise the schemes by volunteering their own time and energies.

It is not at all obvious what it is about businesses, and still less self-styled civil-society groups and NGOs, that makes them legitimate representatives of the public’s needs. For the government, recruiting business to its behaviour-change agenda seems like a further evasion of accountability. Ultimately, whatever companies say about putting people and the planet before profit, they only ever have a partial view of the world.

Only states have – and are authorised by the sovereign people to promote – a more universal view. Whether society should be aiming for healthy living, sustainability or anything else should be part of a broad, democratic discussion, not sneakily foisted upon us by businesses acting under the guidance of NGOs, policy wonks or ministers looking for ways to show they’ve ‘made a difference’.

The original advocates of CSR focused their attention on culture, as do their supposedly more people-centric descendants, because it is at this level – the level of the informal relationships between people – that the potential for contestation and resolution initially emerges. This can be a messy business, and one that states that doubt their own direction and purpose are loath to engage in. They would rather outsource this messy function to others, and attempt to replace all those informal, uncertain and uncontrollable interactions with more predictable formal codes, regulations and responsibilities. That they find willing lap-dogs for this in the ethereal world of think tanks, as well as in businesses suffering from their own crisis of confidence, is not that surprising.

However, if we truly want to change the world then it is ordinary people who will have to assert what really matters to them. CSR – it has been noted by many – is invariably a by-product of business success, not the cause of it. Likewise, it is people’s aspirations for a better world – however we imagine it – that should be the only prompt for the kind of behaviour we adopt.

First published on spiked, 2 November 2011

Message to the West: ‘know thyself’

Since 9/11, terrorists have lived like parasites off the already-existing disorientation of Western elites.

In his opening remarks to the latest US National Strategy for Counterterrorism, released in June, President Barack Obama notes that ‘to defeat al-Qaeda, we must define with precision and clarity who we are fighting’.

Ten years on from 9/11, then, it would appear that one of the key protagonists in what used to be known as the ‘war on terror’ (subsequently rebranded as the ‘long war against terrorism’ and now simply redefined as a ‘war on al-Qaeda’) is still busy attempting to identify and understand its enemy.

This speaks volumes about where the fundamental difficulties of the past decade, as well as the next one, may lie. For all the much-vaunted differences with his predecessor, President Obama comes across as just as confused as George W Bush. At a time when 9/11 was probably still just a twinkle in Osama bin Laden’s eye, Bush Jr addressed an audience at Iowa Western Community College as part of his election campaigning. He expounded: ‘When I was coming up, it was a dangerous world, and you knew exactly who they were. It was Us vs Them, and it was clear who Them was. Today, we are not so sure who they are, but we know they’re there.’

Perhaps both presidents Bush and Obama should have visited the fabled Temple of Apollo at Delphi, where Ancient Greek warriors consulted the oracle before engaging in a protracted conflict. In the temple forecourt the presidents could have read the famous inscription: ‘Know thyself.’

For 10 years, the world’s sole superpower has allowed one of its key strategies to be defined for it, and has allowed itself to be buffeted around as its understanding of who the enemy is has continually changed. As its locus of interest has shifted relentlessly – from terrorists and terrorism to states that may harbour terrorists to technologies that might facilitate terror – so America has consistently and unwittingly advertised to the world that wherever the ‘they’ lead, the US follows.

This is the very opposite of strategic vision. Such vision would require knowing what you are for, what your aims and ambitions are, even in the absence of having to respond to the presumed threats posed by external forces. Knowing your enemy is, of course, a necessity, but the primary task for any nation is to be clear about its own interests in the first place.

And so it is precisely a better understanding of Western culture – its conflicts and contradictions – that might have helped the US authorities appreciate the extent to which the trajectory they were about to embark on was born of their own internal confusions and incoherence.

Osama bin Laden, Ayman al-Zawahiri and those who have sought to emulate them have also given voice to an inchoate rage against modernity, one that rapidly eclipsed the various Western anti-globalisation movements of the time. For all their claims to represent others in the South and the East, the most striking thing about bin Laden et al was the extent to which their ideas were largely Western in origin. While they were careful to dress themselves and their language in Islamic garb, their complaints were predictable and had been well rehearsed by others in the West. As I have put it before, ‘Islam was their motif, not their motive’.

Sadly, by imbuing these people’s puerile and purposeless violence with deeper meaning – to the point of even describing it as an ideology or an understandable reaction – countless international analysts both effectively absolved those involved of responsibility for their actions and helped encourage others to follow their lead.

But what these analysts often missed is that while the ‘war on terror’ may be 10 years old, for its real origins we need to go back at least another 15 years. In the mid-1980s the then Soviet president, Mikhail Gorbachev, appeared dramatically to alter the rules of the Cold War through promoting the twin policies of glasnost and perestroika. He had little choice if he was to delay and soften the blow of his country’s impending implosion. The consequences were to prove just as dramatic for the West as for the East.

It was in this period – before the collapse of the Berlin Wall, and while the CIA was still busy training the Mujahideen to help repel the occupying Soviet forces in Afghanistan – that the need for Western elites to reorganise their own systems and ideologies first emerged. By the time Francis Fukuyama was celebrating ‘The End of History’, it was already becoming clear that the only force that had held conservative elites across the world together during the Cold War period was the supposed twin threat posed by Soviet Marxism and internal state socialism.

Without these forces, the old political right rapidly suffered intellectual exhaustion and then disintegrated, leaving the future up for grabs. In the 1990s there was a constant search for new enemies against which states – in danger of losing their own meaning and purpose – could cohere themselves.

But none of the new litany of demons – from the Contras in Nicaragua to General Aideed in Somalia, from Slobodan Milosevic in the former Yugoslavia to Saddam Hussein in Iraq – could really live up to the cachet of the military and material urgency once imposed by the Red Army. Ethical foreign policy came and went – invented by Tony Blair’s New Labour government and adopted later by the Bush administration.

It was in this period that the old remnants of the left, fearful of being consigned to the dustbin of history, embraced both the environmental movement and the politics of risk and precaution as a way of gaining legitimacy. Originally formulated in relation to addressing ecological problems, this rapidly spread to issues pertaining to public health, consumer safety and beyond. It provided a cohering framework in an age without one. And a key element of the precautionary outlook then being developed was the need for public officials to project worst-case scenarios for society to respond to.

The German sociologist Ulrich Beck’s 1986 bestseller Risk Society was translated into English in 1992 and rapidly gained traction through its ability to reflect the emerging mood and policies.

An outlook shaped on the fringes of local authorities and supra-national bodies of marginal relevance soon became the new organising principle of the West. And what had, until then, been largely dismissed as an exercise in left-wing political correctness by the old right, was catapulted and transmogrified through the tragic events of 9/11.

The new terrorists, then, were both a product of these confusions and, unwittingly, the providers of a flimsy new purpose for the authorities. Criticism of the West had long been around, but never before had it taken such a degraded form as in this post-political age.

In any previous era, the actions of the Islamic radicals would at best have featured as minor disturbances in the footnotes of history. Only in an age schooled in presuming the worst in all eventualities could such mindless violence come to be seen as full of meaning and requiring an all-consuming response.

Ultimately, extremists are merely the extreme expression of mainstream ideas. Their ideas have to come from somewhere. And looking around at the dominant thinking of the post-Cold War world order, it is not too difficult to identify where some of the sources are.

Increasingly, we have become accustomed to presuming that we live in a peculiarly dangerous and uncertain age. Globalisation, which provides most of the benefits we often unconsciously enjoy, has come to be portrayed as the amoral steamroller and destroyer of humanity and history. Human beings are increasingly depicted as being both violent and degraded, as suffering from arrogance and ignorance, or as hapless and vulnerable victims needing constant therapeutic support by a range of experts. Little wonder that such a small coterie of fools, the terrorists who espoused these ideas in an extreme form, could have such strong purchase.

But by overemphasising the extremes, as we are now prone to do, we simultaneously underestimate the significance of the mainstream. Black swans happen but white swans remain far more frequent, and drift can be just as disabling as shock, if not more so.

The Enron crisis occurred at about the same time as 9/11 – and it cost significantly more. It was soon followed by the collapse of WorldCom and, years later, by the 2008 world economic crash. Yet these problems never provoked quite the same sense of urgency as the terrorist threat. Maybe that’s because, at some deeper level, many world leaders know that they cannot be tackled without significantly more far-reaching measures that, despite the culture of precaution, they have studiously sought to avoid.

Despite the billions of dollars expended on the ‘war on terror’ thus far, the US and others are still far from understanding, not just what it is they think they are up against, but also themselves. Without such an understanding there can be little hope for positive progress and development. In the past, some believed they suffered from a US that was too confident and assertive in the world. Today, we see the legacy of a US that is both confused and ambivalent. And we don’t seem to be any better off.

First published on spiked, 8 September 2011

Reconciling growing energy demand with climate change management

Introduction

More than two billion people in India and China are only now emerging from a life of drudgery and abject poverty. A billion more across sub-Saharan Africa, Latin America and other parts of Asia look set to join them over the next decades. This should be a cause for celebration. Instead, much of the contemporary discussion relating to energy needs and climate change portrays these trends as a major problem.

The 2009 United Nations Climate Change Conference in Copenhagen was hailed in advance as reflecting an ‘overwhelming scientific consensus’ on the assumed problem of a link between carbon emissions and climate change, as well as on what needed to be done about it. But instead of agreement there was discord between the developed and the developing nations. The former argued that the latter should monitor and restrain their growth, viewing with a growing sense of alarm the possibility of every Indian and Chinese person expecting a Western lifestyle. They pointed to China now being the largest producer of carbon emissions on Earth.

From the perspective of the developing countries, however, as expressed by the Indian premier, Manmohan Singh, their growth and development are intended to meet internal needs and demands, as well as simply to catch up with the West. Their view is that the advanced capitalist countries had the benefit of industrialising first – thereby releasing into the atmosphere the carbon that is now considered to be a problem. Accordingly, it should be for those countries to lead the way in cutting back on emissions. And anyway, in terms of per capita emissions, it is the developed Western countries that remain the largest polluters.

It appears, then, that the debate over how to meet growing energy demands and manage climate change has reached an impasse. It is difficult to see how, within the current framework, the different perspectives of developed and developing countries can ever be reconciled or resolved.

Reconciling Growing Energy Demand with Climate Change Management, Global Change, Peace & Security, Vol. 23, No. 2, pp.271-282, June 2011