The spy who came in from the Cold War

The Red Army toilet-raiding realities of spying certainly exhilarated Steve Gibson, but the fall of the Berlin Wall brought doubt, too.

From the end of the Second World War through to the end of the Cold War, a little-known unit of British special forces conducted spying missions behind the Iron Curtain – that is, right from the heart of Soviet-occupied East Germany.

Called the British Commanders’-in-Chief Mission to the Group of Soviet Forces of Occupation in Germany, or BRIXMIS for short, it was part of an officially sanctioned exchange of observers between the Red Army and the British Army, established by the victorious Allied powers and the USSR through the Robertson-Malinin agreement of 1946. Its ostensible purpose was to improve communication and relations between them.

In addition to BRIXMIS – and its French and American counterparts in the East – the Red Army conducted similar operations through a unit of its own in West Germany. But, diplomatic liaison and translation duties aside, the real purpose of these units soon became clear: to find out what the other side was up to by heading out into precisely those areas where they had been told not to go.

My friend and former UK Defence Academy colleague Steve Gibson led many of these ‘tours’ just as the Cold War was coming to an end. Live and Let Spy is his gripping recollection of these episodes. Although originally published in 1997 as The Last Mission: Behind the Iron Curtain, it has now been republished and augmented some 15 years later with a significant additional chapter written with the hindsight gained during his subsequent academic career.

Much of the espionage involved gathering evidence about the weaponry available to the Red Army. Accordingly, it typically required lying in wait on a bridge over a railway line at three o’clock in the morning, in the middle of a forest in winter. With temperatures dipping to around minus 30 degrees Celsius, the objective was to photograph all the kit that passed by on a train underneath. Alternatively, they might record the rate of fire of Russian guns from the safety of their locked vehicle in the sweltering 40-degree heat of summer.

Sent back to the Defence Intelligence Staff in Whitehall, this information allowed specialists to determine troop and equipment levels, as well as whether a new bolt on a gun or aerial on a tank might allow it to fire or communicate further than previously estimated – and if so, whether this would necessitate the complete re-evaluation of NATO’s Cold War battle plans.

Of course, the operations required meticulous planning to identify suitably concealed observation posts, as well as efficient access and escape routes. This planning was usually conducted during the day. For those so disposed, there are sufficient ‘tradecraft’ details here to sustain interest. For me, however, the real gem is the lesson identified early in the book – to be as conspicuous as possible by waving at everyone.

This waving tactic disoriented many into believing that the Mercedes G-Class (Geländewagen) passing by with three individuals in army fatigues in it was legitimate. And even if observers suspected something, the fact that nearby children would invariably wave back – raising the possibility that those inside were known to them – would add further confusion or delay. Those who did smell a rat usually did so too late.

It was not just a jolly jaunt. Over the years, a number of tour personnel lost their lives or suffered serious injury through being shot at or having Russian tanks ram their vehicles. East German ‘narks’ were also always on the look-out for anyone in the wrong place, and would report them to the relevant authorities. It is noted, though, that many local ‘Easties’ were keen to help the agents.

Given the challenging circumstances, selection and training were intense and severe. It required individuals who could think quickly on their feet and not just expect to follow rules. It also meant having the ability to complete advanced courses in Russian and German, photography and navigation in next to no time, and to memorise the look and sound of countless pieces of Soviet military equipment, as well as remain calm – yet sharp – when tired or provoked.

For anyone who imagines that spying is glamorous, or somehow akin to being in a Bond movie, they will be disabused by Gibson’s chapter on document-gathering from dumps (literally). It had been recognised for some time that, when they went on manoeuvres in East Germany, the Soviet forces were not supplied with any toilet paper. They would use whatever came to hand – a copy of Pravda, a letter from a loved one, or even their mission papers. And after they were done, it was then that Her Majesty’s specially trained and equipped Cold War warriors really came into their own…

The book is a tour de force of teeth-clenching tension that will keep most readers gripped from beginning to end. But while the first nine chapters retain the action-packed core of the original narrative, filled with the escapades of small teams of rather special individuals trying to find out what the Soviets were up to, the real substance – for those of a more political disposition – is a chapter titled ‘Reflections’.

As Richard Aldrich, professor of political science at the University of Warwick, notes in the new foreword, Gibson is now clearly of the mind that ‘much of what [he] was led to believe [during the Cold War], and some of what he was told, was simply wrong!’

It is testimony to the author’s strength of character that – unlike others – he neither chose to dwell in the past nor fell prone to the ‘invention of illness’. This latter problem, he himself notes, affected many of his one-time colleagues once their personal and moral frameworks disintegrated with the end of the old, Cold War world order.

Gibson’s resolute clear-sightedness is to be admired. Despite having been caught up in the exhilaration of it all as a young man, and despite devoting the prime of his life to the East-West conflict, he refuses to lie to himself. ‘The Cold War’, he notes, ‘was a giant historical cul-de-sac where all enlightened efforts at producing a good society were suspended’.

Aldrich astutely summarises a key argument of Live and Let Spy: ‘while Cold War warriors fought a tyrannical and ruthless version of Communism abroad, they remained ignorant of – and lost – an ideological battle at home’. He then adds accusingly: ‘Western politicians now offer a watered-down version of the interfering, intolerant, controlling and authoritarian government that they were initially set against rather than anything freer.’

In this, he takes his lead from Gibson, who rails against the erosion of ‘moral values, community spirit and sense of purpose’ that now pervades Western political elites. They are ‘pessimistic and misanthropic’, Gibson argues, while ‘suffering from an acute lack of confidence in their own projects’. This lack of authority, this social pessimism, they now effectively impose on others through a ‘moralising intervention into every aspect of private life’. But while the description of this new period will, no doubt, connect with many, Gibson – possibly by trying to cover too much, including passing references to Aristotle, Kant and Bentham for good measure – fails to provide a convincing explanation of why this all came about.

Taking his lead from the BBC documentary film producer Adam Curtis, Gibson identifies how the computer modelling of behaviour – and even more bizarrely, of intentions – came to dominate an intelligence world increasingly devoid of purpose or principle. But, as he himself notes, the intelligence community’s embrace of behaviour modelling is just as likely to be an expression of a broader ‘loss of faith in humans’ as the driver of social processes. Today, that loss of faith – and an obsession with risk management – comes to be expressed through the failure to put eyes and ears on the ground, as Gibson’s once were (a job for which he was awarded an MBE), and thereby a failure to verify theory through practice.

In addition, this final chapter makes three significant and unique contributions to improving our understanding and application of intelligence.

Firstly, Gibson argues that the most useful role of intelligence today is to understand the context correctly, without which ‘purpose is equally misguided’. Secondly, and drawing on his most important academic contribution, he notes that ‘the use of single-source intelligence-reporting drawn from individuals selected principally for their willingness to share secrets…is not the best way to analyse contemporary challenges’, as the illusions about Iraqi ‘weapons of mass destruction’ ably demonstrated.

Whether these new challenges are, as he suggests, those so-called non-traditional security threats, such as climate change, energy supply and food provision, is open to debate. Possibly, it is the ‘dismissal of free will’ and ‘decline into mediocrity’ that he identifies elsewhere that are the real problems. And it is these problems that have turned the essentially technical issues of climate change or food provision into all-consuming sources of uncertainty and insecurity.

Finally, and significantly for one who has made the pursuit of freedom and autonomy central to his existence, Gibson notes the loss of any sense of fun in a politically correct world without an ‘enlightened purposeful ideology around which to cohere’. (This comes from a man who knows something about fun, having, in his youth, gatecrashed an international beauty pageant pretending to be Miss Austria’s personal bodyguard.)

Advocates of the ‘purposeless pragmatism’ and ‘bureaucratic regulation’ he now views as the real barrier to achieving ‘prosperity and progress for all’ would no doubt disapprove of Gibson’s youthful antics. It is unlikely, for instance, that they would appreciate the photographs of naked lovers taken from over one kilometre away that he and his colleagues once sent back to Ministry of Defence analysts to show that their equipment was working and that they were maintaining their skills. But, he notes, it is precisely intolerance towards the criticism – and in this case, the mockery – of widely held beliefs that precludes the effective determination of the truth.

Richard Aldrich notes how ‘Gibson reflects that it takes the passage of time to recognise that one is misled by power’. For those who feel that after the fact is too late, and who still hope to shape history rather than merely be carried along by it, it is only through a constantly evolving analysis of present circumstances that such historical cul-de-sacs can be avoided.

This book – while not pretending to be any more than a personal memoir of some hitherto less disclosed aspects of the Cold War – serves to remind us of how far we have come since. Gibson concludes that, once it all ended, ‘the somewhat hasty, undignified and testy disintegration of the Mission was intrinsically due to the absence of mission itself’.

This may well explain why – as was revealed in Kremlin minutes released some 20 years after the Berlin Wall came down – Western leaders were so keen at the time to remind the Soviet Union’s then-president, Mikhail Gorbachev, that they really did not want a re-unified Germany. Despite their pro-freedom stance and rhetoric, a unified Germany would, in the words of Margaret Thatcher, ‘undermine the stability of the whole international situation and could endanger our security’.

Indeed, the Cold War may well have been the last time that Britain and the other Western powers could even pretend to have had a clear and positive sense of mission.

First published on spiked, 30 March 2012

How CSR became big business

Corporate social responsibility allows governments to avoid accountability and gives companies a sense of purpose.

Whenever society faces a crisis there tends to be a wave of moralism. So it is not surprising that, as the private-debt crisis has transformed into a public-debt calamity, there is now much discussion about the correct conduct of business and finance.

The last time such a significant conversation occurred on these matters was in the mid-1990s. Back then, economic turmoil and the dramatic downfalls of corporations and businessmen like BCCI, Polly Peck and Robert Maxwell – all tainted by accusations of fraud – led to the promotion of ‘corporate social responsibility’ (CSR). The ideas behind this concept were articulated in a landmark inquiry by the Royal Society for the Encouragement of Arts (RSA), Tomorrow’s Company: The Role of Business in a Changing World. It seems fitting, therefore, that the RSA’s current chief executive, Matthew Taylor, recently sought to articulate his vision for ‘enlightened enterprise’, laying out how ‘business can combine a strategy for competitive success with a commitment to social good’.

Looking back, though, it seems many of the corporate contributors to the original study might have been good at talking the CSR talk, but they were considerably less interested in, or capable of, walking the CSR walk.

Quite a few of the companies, including British Gas, British Airways and National Grid, were relatively recent creations of the privatisation boom under the previous Conservative administration. In their cases it is reasonable to suppose that their chief executives were keen to get behind the calls for change. Many others, such as electronics company Thorn EMI, transport and logistics firm Ocean Group, and the IT company FI Group, got caught up in the late-1990s wave of mergers and acquisitions, and so ended up being subsumed or disappearing entirely. No doubt, quite a few individuals got rich in the process.

Some of the original supporters of CSR – like The Strategic Partnership (London) Ltd – were more like tiny, shoestring-budget quangos, staffed by individuals whose intended policy clout far exceeded their business significance. At the other end of the spectrum, among those who pontificated about what makes a responsible company, were the leaders of Barings Venture Partners Ltd. Barings Bank collapsed in 1995 after one of its employees, Nick Leeson, lost £827 million due to speculative investing. So much for being responsible.

Tomorrow’s Company was a product of its time. Bemoaning the absence of non-financial measures for business success, it fed into the growing demand for procedural audits and targets that were to become one of the emblematic pledges of the New Labour government. And, in what was to become typical New Labour lingo, the inquiry demanded greater ‘dialogue’ and ‘inclusivity’.

The RSA inquiry complained of the ‘adversarial culture’ of the business world. This heralded later attacks on various supposed institutional cultures, including the ‘canteen culture’ of the police force, critiqued in the 1999 Stephen Lawrence inquiry, and the ‘club culture’ of the medical profession, lambasted in the 2001 Bristol Royal Infirmary inquiry. There have also been critiques of the army’s ‘barracks culture’ and, more recently, of the ‘macho culture’ of the International Monetary Fund (IMF). This followed the controversy involving the former IMF chief, Dominique Strauss-Kahn, who was claimed to have been protected by a French ‘culture of secrecy’.

The meaning of CSR today

Matthew Taylor, in his recent exposition of ‘enlightened enterprise’, also asks for a ‘shift in our national culture’. But whereas the 1995 RSA study called for change in response to the ‘increasingly complex, global and interdependent’ conditions within which businesses were allegedly operating, for Taylor the key problem to be corrected is human nature.

‘[H]uman beings are complex social animals’, he suggested in a recent speech, ‘influenced more by our nature and context and less by calculating, conscious decisions, than we intuitively believe’. Like other adherents of the new orthodoxies of behavioural economics and evolutionary psychology, Taylor talks of the need to create ‘more capable and responsible citizens’.

So what does all this have to do with business behaviour? One important clue was provided by Mark Goyder, programme director of the original RSA inquiry. He brought up ‘the notion of business as the most important agent of social change, in an age when governments are redefining and limiting their own sphere of influence’. Taylor, for his part, identified the idea of behaviour change as a key aspect of corporate responsibility and explained that the Lib-Con coalition has set up its own behaviour-change unit and that ‘the idea that we need to move from a government-centric to a people-centric model of social change is central to David Cameron’s vision of a Big Society’.

Against the backdrop of these two elements – the changing role of government and the view of ordinary people as little more than natural impulses on legs, as beings who need to be nudged into changing their behaviour – the new role of business becomes transparently clear. Businesses are to act on behalf of governments that can’t be trusted and for people who don’t know what’s good for them.

Taylor is quite explicit about this. ‘[T]he state’, he noted, ‘has many competing objectives and when it uses its power to nudge it opens itself up to charges of paternalism and social engineering’. Businesses, however, have the ability ‘to build on a relationship based on choice and consent, and in some cases a good degree of trust’. All these qualities are presumably no longer to be expected, or demanded, from government.

No doubt, many in the business community will jump at this invitation to take over the levers of power by acting as de facto school prefects on behalf of states that no longer want, or cannot be trusted, to rule. Many will also be excited by the ability to play an ever bigger role in the government’s nudge agenda and to take on the mantle of responsible agents for change.

From profits and growth to ‘performance with purpose’

Today, CSR is big business. And so the success of enterprise in this age is not to have the spirit that took people to the Moon, but to play a part in slimming waistlines and reducing carbon footprints. It’s simply a question of ‘selling the right stuff’, as PepsiCo’s CEO Indra Nooyi has put it. Nooyi has committed her company to ‘performance with purpose’, which includes providing healthy snacks. Likewise, the Mars Corporation’s new focus has little in common with the bold ambitions of the space-race era. It now wants to concentrate on selling products ‘as part of a balanced diet’ and on encouraging people to get ‘fit not fat’.

Taylor is aware of the possibility that not all of us would choose to pursue the ideals that he and his fellow nudge-enthusiasts espouse. To counteract this, business has to take the lead, ‘prompted by NGOs in a sense acting as quasi-regulators and intermediaries with consuming households’. The RSA has taken the lead in this respect, working with Shell and taxi drivers to make fuel-efficient behaviour more habitual.

Ultimately, Taylor comes across as gullible for buying into the idea that corporations want to put social responsibility first. He even cites ‘Flora’s cholesterol-cutting margarine’ as a service in protecting people’s health. This despite the fact that Flora’s claims are highly dubious, and the purported link between high cholesterol and heart disease is increasingly disputed and discredited. Perhaps Taylor will be promoting anti-ageing creams next?

A major error of CSR proponents is to assume that the key determinant of success for businesses and their employees is not making money, but being fulfilled in some other way. Taylor cited a Gallup survey which showed that ‘beyond obvious basic factors like health and a reasonable income, the key determinant of whether someone described themselves as thriving in their lives as a whole was whether they saw their employer – or manager – as a partner rather than a boss’. Here, he sounds rather like one of his predecessors at the RSA, Charles Handy, who, in answer to the question ‘What is a company for?’, said: ‘To talk of profits is no answer because I would say “of course, but profits to what further purpose?”’

CSR: the displacement of responsibility

But real profits, good health and reasonable incomes cannot so readily be assumed. They still have to be achieved, and cannot just be dismissed as ‘obvious’ in a desire to promote a new business agenda. In fact, the CSR agenda has helped businesses get away with ignoring the self-expressed needs of their employees. British Airways, for instance, was commended for its social and environmental reports while simultaneously undermining working conditions for its staff.

Indeed, the ideal CSR scheme focuses its supposed benefits elsewhere – typically it is directed at poor people ‘without a voice’ or, better still, at animals or the environment, which can’t talk back at all. That way, businesses can offer token sums and gestures to impoverished communities and satisfy eco-activists and their media groupies at the same time – all the while compelling staff to subsidise the schemes by volunteering their own time and energies.

It is not at all obvious what it is about businesses, and still less self-styled civil-society groups and NGOs, that makes them legitimate representatives of the public’s needs. For the government, recruiting business to its behaviour-change agenda seems like a further evasion of accountability. Ultimately, whatever companies say about putting people and the planet before profit, they only ever have a partial view of the world.

Only states have, and are authorised by the sovereign people to promote, a more universal view. Whether society should be aiming for healthy living, sustainability or anything else should be part of a broad, democratic discussion, not sneakily foisted upon us by businesses acting under the guidance of NGOs, policy wonks or ministers looking for ways to show they’ve ‘made a difference’.

The original advocates of CSR focused their attention on culture, as do their supposedly more people-centric descendants, because it is at this level – the level of the informal relationships between people – that the potential for contestation and resolution initially emerges. This can be a messy business, and one that states that doubt their own direction and purpose are loath to engage in. They would rather outsource this messy function to others, and attempt to replace all those informal, uncertain and uncontrollable interactions with more predictable formal codes, regulations and responsibilities. That they find willing lap-dogs for this in the ethereal world of think tanks, as well as in businesses suffering from their own crisis of confidence, is not that surprising.

However, if we truly want to change the world then it is ordinary people who will have to assert what really matters to them. CSR – it has been noted by many – is invariably a by-product of business success, not the cause of it. Likewise, it is people’s aspirations for a better world – however we imagine it – that should be the only prompt for the kind of behaviour we adopt.

First published on spiked, 2 November 2011

Message to the West: ‘know thyself’

Since 9/11, terrorists have lived like parasites off the already-existing disorientation of Western elites.

In his opening remarks to the latest US National Strategy for Counterterrorism, released in June, President Barack Obama notes that ‘to defeat al-Qaeda, we must define with precision and clarity who we are fighting’.

Ten years on from 9/11, then, it would appear that one of the key protagonists in what used to be known as the ‘war on terror’ (subsequently rebranded as the ‘long war against terrorism’ and now simply redefined as a ‘war on al-Qaeda’) is still busy attempting to identify and understand its enemy.

This speaks volumes about where the fundamental difficulties of the past decade, as well as the next one, may lie. For all the much-vaunted differences with his predecessor, President Obama comes across as just as confused as George W Bush. At a time when 9/11 was probably still just a twinkle in Osama bin Laden’s eye, Bush Jr addressed an audience at Iowa Western Community College as part of his election campaigning. He expounded: ‘When I was coming up, it was a dangerous world, and you knew exactly who they were. It was Us vs Them, and it was clear who Them was. Today, we are not so sure who they are, but we know they’re there.’

Perhaps both presidents Bush and Obama should have visited the fabled Temple of Apollo at Delphi, where Ancient Greek warriors consulted the oracle before engaging in a protracted conflict. In the temple forecourt the presidents could have read the famous inscription: ‘Know thyself.’

For 10 years, the world’s sole superpower has allowed one of its key strategies to be defined for it, and has also allowed itself to be buffeted around as its understanding of who the enemy is continually changed. As its locus of interest has shifted relentlessly – from terrorists and terrorism to states that may harbour terrorists to technologies that might facilitate terror – so America has consistently and unwittingly advertised to the world that wherever the ‘they’ lead, the US follows.

This is the very opposite of strategic vision. Such vision would require knowing what you are for, what your aims and ambitions are, even in the absence of having to respond to the presumed threats posed by external forces. Knowing your enemy is, of course, a necessity, but the primary task for any nation is to be clear about its own interests in the first place.

And so it is precisely a better understanding of Western culture – its conflicts and contradictions – that might have helped the US authorities appreciate the extent to which the trajectory they were about to embark on was born of their own internal confusions and incoherence.

Osama bin Laden, Ayman al-Zawahiri and those who have sought to emulate them have also spoken of an inchoate rage against modernity that rapidly eclipsed the various Western anti-globalisation movements of the time. For all their purported claims to be representatives of others in the South and the East, the most striking thing about bin Laden et al was the extent to which their ideas were largely Western in origin. While being mindful to dress themselves and their language in Islamic garb, their complaints were predictable and had been well-rehearsed by others in the West. As I have put it before, ‘Islam was their motif, not their motive’.

Sadly, by imbuing these people’s puerile and purposeless violence with deeper meaning – to the point of even describing it as an ideology or an understandable reaction – countless international analysts both effectively absolved those involved of responsibility for their actions and helped encourage others to follow their lead.

But what these analysts often missed is that while the ‘war on terror’ may be 10 years old, for its real origins we need to go back at least another 15 years. In the mid-1980s the then Soviet president, Mikhail Gorbachev, appeared dramatically to alter the rules of the Cold War through promoting the twin policies of glasnost and perestroika. He had little choice if he was to delay and soften the blow of his country’s impending implosion. The consequences were to prove just as dramatic for the West as for the East.

It was in this period – before the collapse of the Berlin Wall, and while the CIA was still busy training the Mujahideen to assist them in repelling the occupying Soviet forces in Afghanistan – that the need for Western elites to reorganise their own systems and ideologies first emerged. By the time Francis Fukuyama was celebrating ‘The End of History’ it was already becoming clear that the only force that had held conservative elites across the world together during the Cold War period was the supposed twin threat posed by Soviet Marxism and internal state socialism.

Without these forces, the old political right rapidly suffered intellectual exhaustion and then disintegrated, leaving the future up for grabs. In the 1990s there was a constant search for new enemies against which states – in danger of losing their own meaning and purpose – could cohere themselves.

But none of the new litany of demons – from the Contras in Nicaragua and General Aideed in Somalia to Slobodan Milosevic in the former Yugoslavia and Saddam Hussein in Iraq – could really match the sense of military and material urgency that had been imposed by the Red Army. Ethical foreign policy came and went – invented by Tony Blair’s New Labour government and adopted later by the Bush administration.

It was in this period that the old remnants of the left, fearful of being consigned to the dustbin of history, embraced both the environmental movement and the politics of risk and precaution as a way of gaining legitimacy. Originally formulated in relation to addressing ecological problems, this rapidly spread to issues pertaining to public health, consumer safety and beyond. It provided a cohering framework in an age without one. And a key element of the precautionary outlook then being developed was the need for public officials to project worst-case scenarios for society to respond to.

The German sociologist Ulrich Beck’s 1986 bestseller Risk Society was translated into English in 1992 and rapidly gained traction through its ability to reflect the emerging mood and policies.

An outlook shaped on the fringes of local authorities and supra-national bodies of marginal relevance soon became the new organising principle of the West. And what had, until then, been largely dismissed as an exercise in left-wing political correctness by the old right, was catapulted and transmogrified through the tragic events of 9/11.

The new terrorists, then, were both a product of these confusions and, unwittingly, the providers of a flimsy new purpose for the authorities. Criticism of the West had long been around, but never before had it taken such a degraded form as in this post-political age.

In any previous period of history, the actions of the Islamic radicals would, at best, have featured as minor disturbances in the footnotes of history. Only in an age schooled in presuming the worst in all eventualities could such mindless violence come to be seen as full of meaning and requiring an all-consuming response.

Ultimately, extremists are merely the extreme expression of mainstream ideas. Their ideas have to come from somewhere. And looking around at the dominant thinking of the post-Cold War world order, it is not too difficult to identify where some of the sources are.

Increasingly, we have become accustomed to presuming that we live in a peculiarly dangerous and uncertain age. Globalisation, which provides most of the benefits we often unconsciously enjoy, has come to be portrayed as the amoral steamroller and destroyer of humanity and history. Human beings are increasingly depicted as being both violent and degraded, as suffering from arrogance and ignorance, or as hapless and vulnerable victims needing constant therapeutic support by a range of experts. Little wonder that such a small coterie of fools, the terrorists who espoused these ideas in an extreme form, could have such strong purchase.

But by overemphasising the extremes, as we are now prone to do, we simultaneously underestimate the significance of the mainstream. Black swans happen but white swans remain far more frequent, and drift can be just as disabling as shock, if not more so.

The Enron crisis occurred at about the same time as 9/11 – and it cost significantly more. It was soon followed by the collapse of WorldCom and, years later, by the 2008 world economic crash. Yet unlike other problems that have emerged over this period, there was never quite the same sense of urgency in addressing these issues. Maybe that’s because, at some deeper level, many world leaders know that they cannot be tackled without significantly more far-reaching measures that, despite the culture of precaution, they have studiously sought to avoid.

Despite the billions of dollars expended on the ‘war on terror’ thus far, the US and others are still far from understanding, not just what it is they think they are up against, but also themselves. Without such an understanding there can be little hope for positive progress and development. In the past, some believed they suffered from a US that was too confident and assertive in the world. Today, we see the legacy of a US that is both confused and ambivalent. And we don’t seem to be any better off.

First published on spiked, 8 September 2011

Reconciling growing energy demand with climate change management

Introduction

More than two billion people in India and China are only now emerging from a life of drudgery and abject poverty. A billion more across sub-Saharan Africa, Latin America and other parts of Asia look set to join them over the next decades. This should be a cause for celebration. Instead, much of the contemporary discussion relating to energy needs and climate change portrays these trends as a major problem.

The 2009 United Nations Climate Change Conference in Copenhagen was hailed in advance as reflecting an ‘overwhelming scientific consensus’ on the assumed problem of a link between carbon emissions and climate change, as well as on what needed to be done about it. But instead of agreement there was discord between the developed and the developing nations. The former, viewing with a growing sense of alarm the possibility of every Indian and Chinese person expecting a Western lifestyle, argued that the latter should monitor and restrain their growth. They pointed to China now being the second-largest producer of carbon emissions on Earth.

From the perspective of the developing countries, however, as expressed by the Indian premier, Manmohan Singh, their growth and development are intended to meet internal needs and demands, as well as simply to catch up with the West. Their view is that the advanced capitalist countries had the benefit of industrialising first – thereby releasing into the atmosphere the carbon that is now considered to be a problem. Accordingly, it should be for those countries to lead the way in cutting back on emissions. And anyway, in terms of per capita emissions, it is these developed Western countries that remain the largest polluters.

It appears, then, that the debate over how to meet growing energy demands and manage climate change has reached an impasse. It is difficult to see how, within the current framework, the different perspectives of developed and developing countries can ever be reconciled or resolved.

Reconciling Growing Energy Demand with Climate Change Management, Global Change, Peace & Security, Vol. 23, No. 2, pp.271-282, June 2011

H1N1 – The Social Costs of Cultural Confusion

Abstract: In May 2011, the World Health Assembly received the report of its International Health Regulations Review Committee examining responses to the outbreak of the 2009 H1N1 pandemic influenza and identifying lessons to be learnt. The report emphasized the need for better risk communication in the future. But risk and communication are not objective facts; they are socially mediated cultural products. Responses to crises are not simply determined by the situation at hand, but also by mental models developed over protracted periods. Accordingly, those forces responsible for promoting the precautionary approach and encouraging the securitization of health – both of which helped foster a catastrophist outlook in this instance – are unlikely to be held to scrutiny. These cultural confusions have come at an enormous cost to society.

H1N1 – The Social Costs of Cultural Confusion, Global Health Governance, Vol. 4, No.2, pp.1-19, June 2011

WHO’s learned nothing from the swine-flu panic?

The over-reaction to H1N1 influenza in 2009 was built on years of waiting for ‘the Big One’.

Over the past few days, the sixty-fourth session of the World Health Assembly (WHA) has been held in Geneva. The WHA is the highest decision-making body of the World Health Organization (WHO). It comprises delegations, up to ministerial level, from the WHO’s 193 constituent member states.

Among the agenda items was to be a discussion of the International Health Regulations 2005 (IHR) in relation to pandemic A (H1N1) 2009 – colloquially known at the time as ‘swine flu’. The IHR first came into force in 2007 and were established to facilitate international cooperation in preventing and responding to acute public-health emergencies, such as the outbreak of influenza that appeared to originate in Mexico two years ago.

The 180-page report, presented by the IHR Review Committee to the WHA, certainly seems impressive. Aside from drawing on numerous national and institutional inputs, it incorporated contributions in some form from well over a hundred individuals from a vast array of agencies worldwide, including the WHO.

But, in essence, only one point of any note is made in it: ‘Critics assert that WHO vastly overstated the seriousness of the pandemic. However, reasonable criticism can be based only on what was known at the time and not what was later learnt.’ This is felt to be of such significance that it is stated three times – in the executive summary, in a slightly modified form in the body of the text, and again in the conclusions. It is intended as a robust rebuttal to those voices – in the media, the medical professions, and elsewhere – who have questioned the global response to H1N1, and the WHO’s role in shaping this response.

Foremost among these has been Paul Flynn, a British Labour MP and rapporteur to the Social, Health and Family Affairs Committee of the Council of Europe, through which he successfully promoted an inquiry into the matter. This inquiry primarily questioned the role of advisors to the WHO, who – through being employed by large pharmaceutical companies that produce anti-viral drugs and vaccines – were held to have had an economic motivation in raising public concerns about swine flu.

The editor of the British Medical Journal, Fiona Godlee, and others, have similarly pointed to possible conflicts of interests, as well as a lack of transparency within the WHO relating to advice and appointments. Sam Everington, former deputy chair of the British Medical Association, went on the record to argue that, in his opinion, the UK’s chief medical officer and the government were ‘actively scaremongering’.

Quite a number of countries worldwide have also raised criticisms since the pandemic abated, ruing the fact that they purchased vast stocks of vaccines at some considerable cost that have remained unused.

But, just as with the official review of the UK’s response to the outbreak, these voices and views are simply non-existent as far as the IHR Review Committee and the WHO are concerned. And anyway, as the report repeatedly reiterates, it is the considered opinion of international public-health specialists that claims of over-reaction to what turned out to be a comparatively mild illness are misguided. Those who point to this are held to be cavalier and complacent as to the possible risks entailed had the situation been different.

What’s more, much emphasis is placed in the report on the fact that Margaret Chan, the director-general of the WHO, and other WHO staff consistently tried to calm matters down, repeatedly noting that the overwhelming majority of cases were mild and recommending to governments that there was no need to restrict travel or trade. If anyone went beyond the measures that were officially advocated then the WHO could hardly be held responsible for this, the report contends. Hence it is to the media, and in particular new social media, that blame is attached.

But all this is to woefully misunderstand and underestimate how communication about risk affects contemporary society. Regulations and warnings are not issued into a vacuum. People and institutions do not merely respond to messages on the basis of the precise information contained within them. Rather they interpret these through the prism of their pre-existing cultural frameworks.

For example, when the former UN weapons inspector Hans Blix advised the world in 2002 that he could find no evidence for weapons of mass destruction in Iraq, it is quite clear that, rather than taking this at face value, the response of the US authorities was to assume that any such weapons were simply well hidden. In other words, they did not allow the facts to stand in the way of their mental model of the world – one in which the Iraqi authorities would invariably lie and operate surreptitiously, regardless of evidence to the contrary.

Likewise, whatever the WHO likes to think it announced about the outbreak of H1N1 influenza in 2009 – ignoring, presumably, the fact that the director-general herself described it as ‘a threat to the whole of humanity’ – its officials should also have been sensitive to the reality that their messages would emerge into a world that had steadily been preparing itself for a devastating health emergency for quite some time.

Indeed, much of this ‘pandemic preparedness’ had been instigated and driven by the WHO itself. It is quite wrong therefore for the IHR Review Committee report to argue that any criticism of the WHO was based on ‘what was later learnt’. It is clear that the global public-health culture that the WHO itself helped to create in advance would inevitably result in just such an over-reaction. It is even possible to go further than this and to predict right now that this will not be an isolated incident. Lessons may be learnt, but mostly the wrong ones.

A critical article in Europe’s largest circulation weekly magazine, Der Spiegel, published just over a year ago, noted how prior to the advent of H1N1 in 2009, ‘epidemiologists, the media, doctors and the pharmaceutical lobby have systematically attuned the world to grim catastrophic scenarios and the dangers of new, menacing infectious diseases’. Indeed, it seemed at the time of the outbreak, to one leading epidemiologist at least, that ‘there is a whole industry just waiting for a pandemic to occur’.

In this, as the IHR Review Committee report makes clear, ‘The main ethos of public health is one of prevention’, before continuing: ‘It is incumbent upon political leaders and policy-makers to understand this core value of public health and how it pervades thinking in the field.’ The authors appear to believe that this is a radical outlook; in fact, this precautionary attitude is the dominant outlook of our times. In that regard at least, the WHO and others were merely doing what came naturally to them when they acted as they did in 2009.

It is the case today that both elites and radicals view the world in near-permanent catastrophist terms. This apocalyptic outlook emerged as a consequence of the broader loss of purpose and direction that affected society in the aftermath of the old Cold War world order – an order that had last provided all sides of the political spectrum with some kind of organising rationale.

Indeed, it was as the Cold War was drawing to a close that the concept of emerging and re-emerging infectious diseases first took hold. And, as noted by the American academic Philip Alcabes in an excellent book on these issues, it was also the point at which the notion of dramatic flu epidemics occurring on a cyclical basis – which until the 1970s had been little more than one of many possible theories – also came to form an essential component of the contemporary imagination.

In the autumn of 2001, the anthrax incidents that affected a tiny number of people in the US in the aftermath of the devastating 9/11 terrorist attacks were heralded by the authorities as a warning of things to come. As a consequence, after many years of being regarded as an unglamorous section of the medical profession, public health was catapulted centre-stage, with vast sums made available to it by military and civilian authorities to pre-empt and prevent any bioterrorist attacks that they now all too readily anticipated.

The outbreak of a novel virus, severe acute respiratory syndrome (SARS), in 2003 – a disease that affected few individuals worldwide but had a relatively high fatality rate – was held by many to confirm that we should always prepare for the worst.

Since then it has been the projected threat of H5N1 ‘avian flu’ jumping across the animal-human barrier that has preoccupied the world public-health authorities. Irrespective of the fact that there have been just 553 cases of H5N1 since 2003, concerns generated by it have been sufficient to push through far-reaching transformations to the world public-health order – including the advent of the IHR themselves.

Now – ominously – aside from deflecting any responsibility for the confusions they helped to create, by describing the H1N1 episode as having exposed ‘difficulties in decision-making under conditions of uncertainty’, the IHR Review Committee notes in conclusion that – looking forwards – the Regulations’ most important shortcoming is that they ‘lack enforceable sanctions’.

In this regard, public health will not just be perceived as a national security concern – as it has already become in many influential circles – but also as one requiring effective policing, possibly with its own enforcement agency, through the establishment of a ‘global, public-health reserve workforce’, as the report suggests.

Aside from absolving the IHR and the WHO of any responsibility for the debacle that saw large numbers of well-informed healthcare workers refusing to be inoculated when the vaccine eventually emerged in 2009 – thereby encouraging the public to act in similar fashion – the report of the Review Committee is also a call to make risk communication more of a priority in the future.

But, far from the public requiring the authorities to speak more slowly, more clearly or more loudly to them, it was precisely the attempted communication of risk – where there was little – that was the problem in the first place. That is why we can be sure that this problem is set to recur, at tremendous cost – both social and economic – to society.

Risk is not simply an objective fact, as some seem to suppose. Rather, it is shaped and mediated through the prism of contemporary culture. That we perceive something to be a risk and prioritise it as such, as well as how we respond to it, are socially mediated elements. These may be informed by scientific evidence but, as indicated above in relation to Iraq, broader trends and outlooks often come to dominate the process.

These are influenced by a vast number of social, cultural and political variables, such as the cumulative impact on our imagination of books, television programmes and films that project dystopian – or positive – visions of the present and the future. Another major influence is the perception of whether the authorities have exaggerated or underestimated other problems, even such apparently unrelated matters as climate change or the 2008 financial crisis.

An emergency, then – whether it relates to health or otherwise – does not simply concern the events, actions and communications of that moment. Rather, it draws together, in concentrated form, the legacies of past events, actions and communications as well. And while it may not have been in the gift of the IHR Review Committee to analyse – still less to act upon – all of these, there is precious little evidence that it considered such dynamics, and its own role within them, at all.

Far from struggling to convey their messages about H1N1 through a cacophony of competing voices – as some within the WHO seem to suppose – the authorities concerned totally dominated the information provided about the pandemic in its early stages. Their mistake is to presume that it was merely accurate information and the effective dissemination of it that was lacking.

Rather, it was the interpretation of this information, according to previously determined frameworks that had evolved over a protracted period, that came to matter most. Accordingly, the WHO tied itself in knots issuing endless advisories at the behest of the various nervous national authorities it had helped to create. This even included guidance on the use of facemasks which, while noting the lack of any evidence for their efficacy, nevertheless conceded that they could be used – provided they were worn and disposed of carefully!

At the onset of the 1968 ‘Hong Kong’ flu epidemic, which killed many tens of thousands more than H1N1, the then UK chief medical officer suggested – erroneously, as it turned out – that he did not envisage the outbreak being a major problem. Far from being lambasted for being wrong, or hounded out of office, as he might be in today’s febrile culture, it appears that the presumption of the times was that it was precisely the role of those in authority to reassure and calm people down, rather than to issue endless, pointless warnings as we witness today.

The WHO, on the other hand, seems determined to assert its moral authority by projecting its worst fears into the public domain. Sadly, it seems, the authorities have not learnt a single lesson from this episode.

It is not the actions of the individuals concerned that the IHR Review Committee report should have scrutinised and sought to exonerate from presumptions of impropriety or personal gain. What urgently needs to be interrogated is the gradual construction of a doom-laden social narrative that WHO officials both helped to build and now need to respond to.

First published on spiked, 23 May 2011

The West’s very own celeb terrorist

Whether he was droning on about climate change or consumption, OBL’s ‘ideas’ were born and bred in the West.

Soon after the death of Osama bin Laden had been announced to the world, 72-year-old Muslim cleric Abu Bakar Bashir – the purported spiritual leader of the Islamist militant group Jemaah Islamiyah – issued a statement from his jail cell in Indonesia, where he faces trial for allegedly funding and organising terrorist camps. The statement, to the effect that ‘Osama’s death will not make al-Qaeda dead’, was designed to instill a sense of foreboding across south-east Asia.

But like all nobodies who hide their own uncertainties and weaknesses behind the words and deeds of supposed somebodies – in this case, behind the dread of al-Qaeda – Bashir simultaneously revealed his own lack of substance. This was apt, because bin Laden himself was always fond of citing Western commentators, academics and diplomats in seeking to legitimise his ostensible cause.

Sounding like any other contemporary critic of American policy, bin Laden droned on about a rag-bag of causes at different times: he lambasted the US for not signing up to the Kyoto treaty to control greenhouse gases; accused Washington of being controlled by a Jewish lobby; suddenly became concerned about Palestine after 9/11; suggested that the wars in Afghanistan and Iraq were simply money-making ventures for large US corporations; and even had the gall – for one in thrall to the Taliban – to argue that Western advertising exploited women.

In this regard, bin Laden revealed his true nature through his statements – including his annual post-9/11 rants that became as boring and predictable as the British queen’s Christmas message. He was entirely parasitical on what was being said about him and about the state of world affairs in the West. After the Madrid bombings of 2004, he even proposed that Western leaders should pay more attention to surveys that revealed how few people supported the war in Iraq.

But what kind of spiritual leader is it who piggy-backs on Western opinion-poll data and the views of environmentalists to get his point across? Why did he advocate reading Robert Fisk and Noam Chomsky, rather than the Koran? In truth, bin Laden was entirely lacking in any substantial ideas of his own, let alone anything that could amount to an ideology. More media-has-been than mujahideen after his escape from US forces in late 2001, bin Laden was the leader of nothing who became the quintessential celebrity terrorist of our times – unable even to control his own fans, never mind control the course of history.

Sadly, those who opposed him were just as devoid of principles of their own. Accordingly, across the political spectrum and in all countries, political leaders and officials who themselves lacked purpose and direction sought to justify their increasingly illiberal policies and actions on the basis of the need to defeat al-Qaeda. Bashir’s recent words of warning ring true because much the same point was made by President Obama in his address to the nation, as well as being echoed by the head of the CIA, the UK prime minister David Cameron, and countless others.

Without al-Qaeda, the global counterterrorism industry would find itself in a real quandary. Little wonder that there is such enthusiasm to reiterate the danger from radical Islam now. The fact that the recent transformations in the Middle East – heralded by some as an ‘Arab spring’ – made little to no reference to either Palestine, or bin Laden and al-Qaeda, makes not a jot of difference to the insights of the self-styled experts.

Far from representing the views and grievances of those in the East and South – whom he never consulted – bin Laden was always a product of the West. He jumped on every bandwagon like some demented blogger and echoed the Western self-loathing he found there. His words would then be picked up again by both followers and critics who lacked the courage to speak out for themselves but preferred instead to point to bin Laden’s empty threats as evidence of what Muslim frustrations and humiliations might lead to.

Instead of a clash of civilisations we had a war of gestures, as every controversy in the West about cartoons, books – and now even celebrations – that might be deemed offensive was picked up on as a further example of the supposed victimisation of Muslims. This over-sensitivity to images and words only further exacerbated the situation, as whole populations were taught that they must never put up with being offended.

Many commentators, aside from implicitly supporting al-Qaeda’s cause by giving a nod to the simplistic notion that suffering, anger and resentment inevitably lead to terrorism, have also noted more critically how the group came to kill more Muslims than Americans through its actions. But this criticism suggests that if the figures had been skewed the other way, if fewer Muslims had been killed, then these commentators would have been somewhat more understanding towards bin Laden.

The solution frequently put forward to resolve matters has been to create de-radicalisation programmes. However, given that the clerics involved in such programmes share the same misgivings about the modern world as the people they’re supposed to be saving, one wonders if these initiatives could ever possibly be truly successful.

Most notable is the general presumption that the removal of bin Laden will somehow lead to a greater risk in the immediate future through the possibility of reprisal attacks that could occur against anyone, anywhere and at any time. This model is itself a construct of the contemporary culture of fear that exists in the West today, presuming that as one threat goes away, another steps in to fill the void.

Those who argue this way fail to note that while there may be aggrieved individuals at large, these people rarely target the symbols of imperial or racial oppression that are held to drive them. Rather, by lashing out at all manner of symbols of modernity – tall buildings, aeroplanes, shopping malls, night clubs – they reveal their frustrations to be a quite mainstream rejection of Western materialism, and not the religiously inspired attacks that so many commentators presume.

First published on spiked, 5 May 2011

Fukushima: sounding worse, getting better

Obsessed with the idea of a nuclear meltdown, the doom-mongers are blind to the reality at Fukushima.

Over the weekend, much of the world’s media reported a radiation spike emanating from Japan’s stricken Fukushima nuclear power plant of the order of 10 million times above the norm. It soon transpired that this figure was erroneous and it has since been retracted by the Japanese authorities. But why did so many seem so keen to report the alarming estimate?

The closer the situation comes to being resolved at Fukushima, the clearer it will become what actually happened there. Hence it will sound like matters are getting worse just as they are getting better. As things stand it would seem that one of the worst earthquakes ever recorded, followed by a devastating tsunami that took out the back-up generators required to cool the nuclear facility, may have caused a minor fissure to the casing of one of six reactors, leading to some radioactive materials being released into the environment.

It is important to maintain a sense of proportion and perspective about this. The quantities released – alarmingly headlined as raising radiation levels in nearby seawater to 1,250 times the normal safety limit – still amount to less than one per cent of what was released over the course of the worst nuclear accident in history, at Chernobyl in the former Soviet Union in 1986.

There are two things worth noting from the outset. Firstly, 1,250 times the normal safety level still amounts to not very much at all. And secondly, contrary to the popular myths about Chernobyl, it is today a visitor destination, albeit for what the trade identifies as ‘extreme tourism’. The three remaining reactors at Chernobyl reopened just seven months after the explosion there, and one of them operated right through to December 2000. Since then a small army of workers has been on-site, steadily decommissioning the plant – a process that could still take many years.

Alarmist figures for the number of people affected by the Chernobyl disaster bear no resemblance to the actual data confirmed by the Chernobyl Forum – a group that includes the UN, the IAEA and the WHO – in its 2006 report on the matter. Only 50 deaths can be directly attributed to the accident. These occurred among the workers brave enough to return to the burning plant to sort out the mess, and among a small number of children in the wider community who developed thyroid cancer.

Those who suggest that thousands, maybe even tens of thousands, of fatal cancers are linked to the Chernobyl disaster are basing these estimates on extrapolations from the effects of the atomic bombs dropped on Japan in 1945. These estimates are derived using a linear extrapolation from the effects of high levels of radiation received in an instant as the bombs exploded. But most researchers recognise that the circumstances in Hiroshima and Nagasaki were very different to those in Chernobyl. Such estimates are, therefore, based on rather shaky evidence. It is like suggesting that because a temperature of 200 degrees Celsius would kill 100 per cent of human beings, so a temperature of 20 degrees Celsius should kill 10 per cent of them. In reality, our bodies are able to tolerate radiation up to a certain threshold. Low levels of radiation are almost certainly harmless.
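
To make that reasoning step concrete, here is a minimal Python sketch of the two dose-response models being contrasted. The risk coefficient and threshold it uses are hypothetical round numbers, not real epidemiological parameters; the point is simply to show how a linear, no-threshold extrapolation turns a tiny average dose spread across millions of people into hundreds of projected deaths, while a threshold model of the kind alluded to above projects none.

# Illustrative only: both parameters below are hypothetical round numbers,
# not real epidemiological values.
LINEAR_RISK_PER_SV = 0.05    # assumed excess fatal-cancer risk per sievert
ASSUMED_THRESHOLD_SV = 0.1   # assumed dose below which no harm occurs


def linear_no_threshold_deaths(mean_dose_sv: float, population: int) -> float:
    """Linear extrapolation: any dose, however small, scales risk proportionally."""
    return LINEAR_RISK_PER_SV * mean_dose_sv * population


def threshold_model_deaths(mean_dose_sv: float, population: int) -> float:
    """Threshold model: doses below the assumed threshold contribute nothing."""
    if mean_dose_sv < ASSUMED_THRESHOLD_SV:
        return 0.0
    return LINEAR_RISK_PER_SV * (mean_dose_sv - ASSUMED_THRESHOLD_SV) * population


# A small average dose (1 mSv each) spread across five million people
dose_sv, people = 0.001, 5_000_000
print(linear_no_threshold_deaths(dose_sv, people))  # 250.0 projected deaths
print(threshold_model_deaths(dose_sv, people))      # 0.0

Neither set of numbers is offered as correct; the sketch only shows that the headline casualty figures depend entirely on which model is assumed.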

This brings us back to the contaminated seawater, as well as the affected food items and drinking water in Japan today. The situation is certainly not ideal, and no doubt lessons will be learnt from all this, as they always are after every emergency. Indeed, whether we appreciate it or not, it is only by taking risks that society has evolved to the form in which it exists today, whereby we live longer and healthier lives than any preceding generation. Sadly, it is only by making mistakes that we learn the real limits of anything. And as some have also indicated, even the worst levels of radiation reported from Japan – aside from those to which a handful of workers have been exposed – amount to little compared with natural background levels in other places on earth, and compare favourably with exposures we voluntarily subject ourselves to, whether from flying or from having an X-ray or a CT scan.

The situation now is likely to require a plentiful supply of energy to resolve – energy which, like it or not, will probably come from other nuclear facilities, not from windmills and solar panels. These renewable technologies, while they may be desirable for the future, will only emerge based on what we have available to us in the here and now.

The anti-nuclear campaigners, however – alongside the far bigger army of catastrophists, who seem keen to imagine the worst at every opportunity – are now smugly standing by to say ‘I told you so’. But none of them suggested there would be a tiny crack through which a limited amount of radiation might leak. Rather, there was a cacophony of voices projecting meltdown and Armageddon. And, as none of these commentators were nuclear engineers attending the site in Japan itself, all they could do was imagine the worst and project that fantasy into the public domain.

It would be preferable to have a few more trained specialists dealing with the actual emergency. From a sociological perspective, however, one focused particularly on risks and how these are perceived and communicated, it was entirely predictable that an assortment of risk entrepreneurs, doom-mongers and assorted lobbyists would clamour to claim this incident for themselves and attach it to whatever fear-laden view they hold.

Eight years ago, as hostilities resumed in Iraq, many were determined to uncover Saddam Hussein’s supposed stash of weapons of mass destruction there, despite the evidence consistently pointing to their absence. We were advised instead to focus on the unknown – or the ‘unknown unknowns’, as the US defence secretary Donald Rumsfeld famously put it. Two years ago, once the director-general of the WHO had identified H1N1 as ‘a threat to the whole of humanity’, nations everywhere cranked into pandemic-prevention overdrive, convinced that only their precautionary actions could save humanity – this despite all the evidence pointing to the outbreak of a mild strain of influenza. We have to recognise that once a particular mindset is established, it is very hard for people to accept that their model of the world may not be correct, even if the facts are staring them in the face.

This is the pattern being repeated around the nuclear incident in Japan. Some newscasters seem determined to convey the worst that could happen, as if this were some public service. But surely at such times the role of the media is to report the facts rather than imagine a Hollywood script? The problem we now confront is that a significant number of cultural pessimists have staked their reputations on proving that there was a major problem and possibly that this was covered up. Such individuals seem to desire – if not need – the worst, to confirm their apocalyptic frameworks. It is high time we focused on the evidence and let those who are actually capable of dealing with the mess at Fukushima get on with their jobs without having to worry that their every step will be projected on to the world stage as an example of incompetence and conspiracy.

First published on spiked, 29 March 2011

The mad post-tsunami food panic

You could eat Japan’s so-called ‘radioactive spinach’ for a whole year and it still wouldn’t cause you much harm.

It would require an iron will to stand in the face of today’s febrile culture and oppose the wave of countries rapidly withdrawing Japanese foodstuffs from their shelves ‘in line with the precautionary approach’, as a Singapore government spokesperson put it.

Having alerted the world to elevated levels of radiation in food items such as spinach and milk, as well as to doses twice the recommended limit for babies in Tokyo’s drinking water, the Japanese government really has no one but itself to blame. After coping admirably with the immediate aftermath of the earthquake and the tsunami, and demonstrating the resolve to address the situation at the Fukushima nuclear power plant, it is at the level of communication that the authorities may yet score an own goal.

The Japanese cabinet secretary, Yukio Edano – until now the image of cool with his detached demeanour and worker’s overalls at press conferences – has asked international importers to take a ‘logical stance’ over the food situation. They will. Unfortunately, it is not the logic he may have had in mind. ‘Even if these foods are temporarily eaten, there is no health hazard’, he advised. Others have indicated that one would have to drink a lot of the water before being harmed. Drinking the water in Tokyo for a year might expose you to an additional 0.8 millisieverts (mSv) of radiation. But then living in some of the places on earth where the natural background radiation is above the norm could easily expose you to 10 times as much.

Needless to say, people continue to live in such areas – and have babies. In fact, there is a considerable body of evidence to suggest that, if anything, their longevity may be enhanced by such exposure. After all, biological life emerged into an environment with far more radiation, from the ground and from space, than exists today.

Eating the spinach non-stop for a year (perish the thought) would give you a radiation dose equivalent to about one CT scan. Drinking the milk endlessly would be even less of a problem. In fact, you would be sick of eating and drinking these products long before any of them could make you sick from radiation poisoning or cancer.
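
To put the figures quoted in the last few paragraphs side by side, the short Python sketch below lines them up against some commonly cited benchmarks. Only the 0.8 millisievert figure for a year of Tokyo tap water comes from the article itself; the other values are rough, widely quoted approximations included purely for scale, not precise doses.

# Rough dose comparison in millisieverts (mSv). The tap-water figure is the
# one quoted above; the rest are rough, commonly cited approximations.
doses_msv = {
    "Tokyo tap water for a year (figure quoted above)": 0.8,
    "Typical worldwide natural background, per year (approx.)": 2.4,
    "High natural-background regions, per year (approx.)": 8.0,
    "Single abdominal CT scan (approx.)": 7.0,
    "Long-haul return flight (approx.)": 0.1,
}

water = doses_msv["Tokyo tap water for a year (figure quoted above)"]
for source, dose in sorted(doses_msv.items(), key=lambda item: item[1]):
    print(f"{source:<58} {dose:5.1f} mSv  ({dose / water:4.1f}x the water figure)")

On these rough numbers the arithmetic supports the point being made: the additional dose from the water is a fraction of ordinary background exposure, and a year’s worth of the spinach, equated above to roughly one CT scan, sits within the range of exposures people accept routinely.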

So where did it all go wrong for Edano? Where did the army of over-zealous officials wanting to ban things on a precautionary basis come from? Should we blame the US – we often do – for starting the cascade? Or was it the media who irresponsibly amplified concerns?

In fact, if we truly hope to understand the confusions now emerging over the situation regarding food from Japan, there is little point in looking there, or even trying to understand nuclear accidents and radiation, or the role of today’s nervous officials and the media.

Rather, since the end of the Cold War in 1989, the world has steadily been reorganised around the principle that it is better to be safe than sorry. That sounds eminently sensible. But is it true? Is there not a point where safety comes at a cost to other areas of life? For instance, if we were to put all our resources into combating terrorism, there would be none left to deal with disease.

Risk management is always about such trade-offs. But the mantra that we should be as safe as possible and always take precautionary measures whenever possible has become good coin among bureaucratic elites the world over. This provided governments with a new role once the old Soviet enemy had imploded. Noting too that the end of the old-style confrontational politics had also left people rather more isolated and insecure, politicians refashioned themselves as the people’s protectors of last resort.

This has come at a tremendous cost to society – leaders driven more by events than by principles, and populations used to having their prejudices pandered to rather than challenged. The rot, of course, started at the top. Witness the large number of foreign nationals in Japan, many of whom were caught up in these tumultuous events and wanted to stay behind to help their friends and loved ones. They even wanted to help complete strangers – though of course we now know, because we have been brought up to believe so, that strangers are a danger anyway.

So, rather than pursuing their humane instincts, according to their own assessment of what the real risks were, many such individuals were advised, by their own national governments, to get out. Get out of the region. Get out of Tokyo. Get out of Japan.

In the past, those who ran away from people in need, particularly people they knew, might have been accused of cowardice. Today, we call it taking ‘precautionary measures’.

Welcome to the brave new world of risk-obsessed politics. Far from building character and making populations more resilient, as the leaders of some of these countries constantly profess to be doing, what we find is a highly confused culture that encourages a febrile response, both on the ground and many thousands of miles away.

It is this that might prove to be the greatest problem for the wider Japanese population for quite some time to come.

First published on spiked, 24 March 2011

Human security – a retrospective

Introduction

This paper is not a comprehensive critique of the concept of human security, for which the reader should look to some of the authors cited here and those whom they cite. Rather, it is based on remarks prepared as a discussant at a human security workshop held at La Trobe University in Melbourne, Australia on 8 June 2010.

The concept of human security has come of age. Many writers are today examining how far the concept has come, where it came from, and in which direction it should now be heading.

Like an artist whose work is about to undergo its first major retrospective, human security has reached a point that shows how entirely mainstream the framework has become. It is embraced by many across the political spectrum, as well as by activists from civil society organisations.

Human security certainly has traction. After all, who could possibly oppose the notion of ‘freedom from fear and freedom from want’ for everybody? This formulation, originally attributed to Franklin D. Roosevelt, is the one embedded in the 1994 United Nations Development Programme (UNDP) Human Development Report, authored by the Pakistani economist Mahbub Ul-Haq, which many see as one of the drivers of this agenda.

But maybe the growing popularity of, resonance with, and attempts to implement human security are not based on the reasons most appear to suppose. Maybe it is not simply an inherent good to be applied across the board of international relations as quickly as possible. In the words of the Canadian academic and policy advisor Andrew Mack, ‘Human Security’s importance lies less in its explanatory powers than as a signifier of shared political and moral values’.

If so, we should be alert to some of the unexpected consequences of this characteristic, particularly if the concept itself is found to be wanting.

First published in Global Change, Peace & Security, Vol. 22, No. 3, pp. 385-390, October 2010