Theory informed by practice. Application informed by purpose. Why to understand and manage risk, cultural context is the key

Abstract: To be effective, risk analysis and risk management depend on their ability to engage with and communicate to non-specialist audiences, whether these be policy-makers asked to turn the advice that they agree with into practice, those implementing decisions, or the public, who are often on the receiving end of these.

Accordingly, there needs to be clarity of purpose regarding – and reflected through – the language used, the partners engaged, and the proposed ends of any measures to be implemented. These elements sit within specific cultural contexts – both geographical and historical – and it is essential to account for these in translating theory into practice.

This article surveys the discourse used across various examples, including a detailed case study. The most significant conclusion is that, while data and evidence certainly matter for validation, understanding culture remains key to effective risk analysis and trustworthy risk management because, on the whole, people look for meaning beyond the mere ‘facts’. This applies to risks assumed to be narrowly technical as much as those with a strong social, cultural and political dimension.

Few risk analysts and safety experts consider or account for the broader, contextual and cultural factors that impact their choices, analyses and modes of dissemination. This creates a divide between those commissioning and conducting the research and those to whom it is held to apply and by whom it needs to be implemented, which undermines democratic accountability, as well as the possible benefits of, and trust in, their enterprise.

Durodie, W. (2017). Theory informed by practice. Application informed by purpose. Why to understand and manage risk, cultural context is the key. Safety Science, 99(B), 244-254. https://doi.org/10.1016/j.ssci.2017.04.002

Precautionary Tales – Missing the Problem and its Cause

Extract: Two recently published volumes on the concept of precaution, as it is variously understood and applied across the United States and in Europe, make for a fascinating comparative analysis. They also each offer some invaluable insights into the subject. Sadly, neither really addresses how precaution came of age, or why.

Precautionary Tales: Missing the Problem and the Cause, European Journal of Risk Regulation, Vol.4, No.2, pp.297-299, June 2013

H1N1 – The Social Costs of Cultural Confusion

Abstract: In May 2011, the World Health Assembly received the report of its International Health Regulations Review Committee examining responses to the outbreak of the 2009 H1N1 pandemic influenza and identifying lessons to be learnt. The report emphasized the need for better risk communication in the future. But risk and communication are not objective facts; they are socially mediated cultural products. Responses to crises are not simply determined by the situation at hand, but also by mental models developed over protracted periods. Accordingly, those forces responsible for promoting the precautionary approach and encouraging the securitization of health, both of which helped encourage a catastrophist outlook in this instance, are unlikely to be held to scrutiny. These cultural confusions have come at an enormous cost to society.

H1N1 – The Social Costs of Cultural Confusion, Global Health Governance, Vol. 4, No.2, pp.1-19, June 2011

WHO’s learned nothing from the swine-flu panic?

The over-reaction to H1N1 influenza in 2009 was built on years of waiting for ‘the Big One’.

Over the past few days, the sixty-fourth session of the World Health Assembly (WHA) has been held in Geneva. The WHA is the highest decision-making body of the World Health Organization (WHO). It comprises delegations, up to ministerial level, from the WHO’s 193 constituent member states.

Among the agenda items was to be a discussion of the International Health Regulations 2005 (IHR) in relation to pandemic A (H1N1) 2009 – colloquially known at the time as ‘swine flu’. The IHR first came into force in 2007 and were established to facilitate international cooperation in preventing and responding to acute public-health emergencies, such as the outbreak of influenza that appeared to originate in Mexico two years ago.

The 180-page report, presented by the IHR Review Committee to the WHA, certainly seems impressive. Aside from receiving numerous national and institutional inputs, well over a hundred individuals from a vast array of agencies worldwide, including the WHO, contributed in some form to its findings.

But, in essence, only one point of any note is made in it: ‘Critics assert that WHO vastly overstated the seriousness of the pandemic. However, reasonable criticism can be based only on what was known at the time and not what was later learnt.’ This is felt to be of such significance that it is stated three times – in the executive summary, in a slightly modified form in the body of the text, and again in the conclusions. It is intended as a robust rebuttal to those voices – in the media, the medical professions, and elsewhere – who have questioned the global response to H1N1, and the WHO’s role in shaping this response.

Foremost among these has been Paul Flynn, a British Labour MP and rapporteur to the Social, Health and Family Affairs Committee of the Council of Europe, through which he successfully promoted an inquiry into the matter. This inquiry primarily questioned the role of advisors to the WHO, who – through being employed by large pharmaceutical companies that produce anti-viral drugs and vaccines – were held to have had an economic motivation in raising public concerns about swine flu.

The editor of the British Medical Journal, Fiona Godlee, and others have similarly pointed to possible conflicts of interest, as well as a lack of transparency within the WHO relating to advice and appointments. Sam Everington, former deputy chair of the British Medical Association, went on the record to argue that, in his opinion, the UK’s chief medical officer and the government were ‘actively scaremongering’.

Quite a number of countries worldwide have also raised criticisms since the pandemic abated, ruing the fact that they purchased vast stocks of vaccines at some considerable cost that have remained unused.

But, just as with the official review of the UK’s response to the outbreak, these voices and views are simply non-existent as far as the IHR Review Committee and the WHO are concerned. And anyway, as the report repeatedly reiterates, it is the considered opinion of international public-health specialists that claims of over-reaction to what turned out to be a comparatively mild illness are misguided. Those who point to this are held to be cavalier and complacent as to the possible risks entailed should the situation have been different.

What’s more, much emphasis is placed in the report on the fact that Margaret Chan, the director-general of the WHO, and other WHO staff consistently tried to calm matters down, repeatedly noting that the overwhelming majority of cases were mild and recommending to governments that there was no need to restrict travel or trade. If anyone went beyond the measures that were officially advocated then the WHO could hardly be held responsible for this, the report contends. Hence it is to the media, and in particular new social media, that blame is attached.

But all this is to woefully misunderstand and underestimate how communication about risk affects contemporary society. Regulations and warnings are not issued into a vacuum. People and institutions do not merely respond to messages on the basis of the precise information contained within them. Rather they interpret these through the prism of their pre-existing cultural frameworks.

For example, when the former UN weapons inspector Hans Blix advised the world in 2002 that he could find no evidence for weapons of mass destruction in Iraq, it is quite clear that, rather than reading this at face value, the response of the US authorities was to assume that any such weapons were simply well hidden. In other words, they did not allow the facts to stand in the way of their mental model of the world – one in which the Iraqi authorities would invariably lie and operate surreptitiously, regardless of evidence to the contrary.

Likewise, whatever the WHO likes to think it announced about the outbreak of H1N1 influenza in 2009 – ignoring, presumably, the fact that the director-general herself described it as ‘a threat to the whole of humanity’ – its officials should also have been sensitive to the reality that their messages would emerge into a world that had steadily been preparing itself for a devastating health emergency for quite some time.

Indeed, much of this ‘pandemic preparedness’ had been instigated and driven by the WHO itself. It is quite wrong therefore for the IHR Review Committee report to argue that any criticism of the WHO was based on ‘what was later learnt’. It is clear that the global public-health culture that the WHO itself helped to create in advance would inevitably result in just such an over-reaction. It is even possible to go further than this and to predict right now that this will not be an isolated incident. Lessons may be learnt, but mostly the wrong ones.

A critical article in Europe’s largest circulation weekly magazine, Der Spiegel, published just over a year ago, noted how prior to the advent of H1N1 in 2009, ‘epidemiologists, the media, doctors and the pharmaceutical lobby have systematically attuned the world to grim catastrophic scenarios and the dangers of new, menacing infectious diseases’. Indeed, it seemed at the time of the outbreak, to one leading epidemiologist at least, that ‘there is a whole industry just waiting for a pandemic to occur’.

In this, as the IHR Review Committee report makes clear, ‘The main ethos of public health is one of prevention’, before continuing: ‘It is incumbent upon political leaders and policy-makers to understand this core value of public health and how it pervades thinking in the field.’ The authors appear to believe that this is a radical outlook; in fact, this precautionary attitude is the dominant outlook of our times. In that regard at least, the WHO and others were merely doing what came naturally to them when they acted as they did in 2009.

It is the case today that both elites and radicals view the world in near-permanent catastrophist terms. This apocalyptic outlook emerged as a consequence of the broader loss of purpose and direction that affected society after the end of the old Cold War world order, which had last provided all sides of the political spectrum with some kind of organising rationale.

Indeed, it was as the Cold War was drawing to a close that the concept of emerging and re-emerging infectious diseases first took hold. And, as noted by the American academic Philip Alcabes in an excellent book on these issues, it was also the point at which the notion of dramatic flu epidemics occurring on a cyclical basis – which until the 1970s had been little more than one of many possible theories – also came to form an essential component of the contemporary imagination.

In the autumn of 2001, the anthrax incidents that affected a tiny number of people in the US in the aftermath of the devastating 9/11 terrorist attacks were heralded by the authorities as a warning of things to come. As a consequence, after many years of being regarded as an unglamorous section of the medical profession, public health was catapulted centre-stage, with vast sums made available to it by military and civilian authorities to pre-empt and prevent the bioterrorist attacks that they now all too readily anticipated.

The outbreak of a novel virus, severe acute respiratory syndrome (SARS), in 2003 – a disease that affected few individuals worldwide but had a relatively high fatality rate – was held by many to confirm that we should always prepare for the worst.

Since then it has been the projected threat of H5N1 ‘avian flu’ jumping across the animal-human barrier that has preoccupied the world public-health authorities. Irrespective of the fact that there have been just 553 cases of H5N1 since 2003, concerns generated by it have been sufficient to push through far-reaching transformations to the world public-health order – including the advent of the IHR themselves.

Now – ominously – aside from deflecting any responsibility for the confusions they helped to create, by describing the H1N1 episode as having exposed ‘difficulties in decision-making under conditions of uncertainty’, the IHR Review Committee note in conclusion that – looking forwards – their most important shortcoming is that they ‘lack enforceable sanctions’.

In this regard, public health will not just be perceived as a national security concern – as it has already become in many influential circles – but also as one requiring effective policing, possibly with its own enforcement agency, through the establishment of a ‘global, public-health reserve workforce’, as the report suggests.

Aside from absolving the IHR and the WHO of any responsibility for the debacle that saw large numbers of well-informed healthcare workers refusing to be inoculated when the vaccine eventually emerged in 2009 – thereby encouraging the public to act in similar fashion – the report of the Review Committee is also a call to make risk communication more of a priority in the future.

But, far from the public requiring the authorities to speak more slowly, more clearly or more loudly to them, it was precisely the attempted communication of risk – where there was little – that was the problem in the first place. That is why we can be sure that this problem is set to recur, at tremendous cost – both social and economic – to society.

Risk is not simply an objective fact, as some seem to suppose. Rather, it is shaped and mediated through the prism of contemporary culture. That we perceive something to be a risk and prioritise it as such, as well as how we respond to it, are socially mediated elements. These may be informed by scientific evidence but, as indicated above in relation to Iraq, broader trends and outlooks often come to dominate the process.

These perceptions are shaped by a vast number of social, cultural and political variables, such as the cumulative impact on our imagination of books, television programmes and films that project dystopian – or positive – visions of the present and the future. Another major influence is the perception of whether the authorities have exaggerated or underestimated other problems, even such apparently unrelated matters as climate change or the 2008 financial crisis.

An emergency then – whether it relates to health or otherwise – does not simply concern the events, actions and communications of that moment. Rather, it draws together, in concentrated form, the legacies of past events, actions and communications as well. And while it may not have been in the gift of the IHR Review Committee to analyse, and – still less – to act upon all of these, there is precious little evidence that they considered such dynamics – and their own role within them – at all.

Far from struggling to convey their messages about H1N1 through a cacophony of competing voices – as some within the WHO seem to suppose – the authorities concerned totally dominated the information provided about the pandemic in its early stages. Their mistake is to presume that it was merely accurate information and the effective dissemination of it that was lacking.

Rather, it was the interpretation of this information according to previously determined frameworks that had evolved over a protracted period that came to matter most. Accordingly, the WHO tied itself in knots issuing endless advisories at the behest of the various nervous national authorities it had helped to create. This even included guidance on the use of facemasks which, whilst noting a lack of any evidence for the efficacy of these, nevertheless conceded that they could be used, but if so that they should be worn and disposed of carefully!

At the onset of the 1968 ‘Hong Kong’ flu epidemic, which killed many tens of thousands more than H1N1, the then UK chief medical officer suggested – erroneously – that he did not envisage the outbreak being a major problem. Far from being lambasted for being wrong, or hounded out of office, as he might be in today’s febrile culture, the presumption of the times appears to have been that it was precisely the role of those in authority to reassure and calm people down, rather than to issue the endless, pointless warnings we witness today.

The WHO, on the other hand, seems determined to assert its moral authority by projecting its worst fears into the public domain. Sadly, it seems, the authorities have not learnt a single lesson from this episode.

It is not the actions of the individuals concerned – which the IHR Review Committee report scrutinised and sought to exonerate from presumptions of impropriety or personal gain – that urgently need to be interrogated, but rather the gradual construction of a doom-laden social narrative that WHO officials have both helped to build and now need to respond to.

First published on spiked, 23 May 2011

Fukushima: sounding worse, getting better

Obsessed with the idea of a nuclear meltdown, the doom-mongers are blind to the reality at Fukushima.

Over the weekend, much of the world’s media reported a radiation spike emanating from Japan’s stricken Fukushima nuclear power plant of the order of 10 million times above the norm. It soon transpired that this figure was erroneous and it has since been retracted by the Japanese authorities. But why did so many seem so keen to report the alarming estimate?

The closer the situation comes to being resolved at Fukushima, the clearer it will become what actually happened there. Hence it will sound like matters are getting worse just as they are getting better. As things stand it would seem that one of the worst earthquakes ever recorded, followed by a devastating tsunami that took out the back-up generators required to cool the nuclear facility, may have caused a minor fissure to the casing of one of six reactors, leading to some radioactive materials being released into the environment.

It is important to maintain a sense of proportion and perspective about this. The quantities released, while alarmingly headlined as raising radiation levels in nearby seawater to 1,250 times the normal safety limit, still amount to less than one per cent of what was released over the course of the worst nuclear accident in history, at Chernobyl in the former Soviet Union in 1986.

There are two things worth noting from the outset. Firstly, that 1,250 times the normal safety level still amounts to not very much at all. And secondly, contrary to the popular myths about Chernobyl, it is today a visitor destination, albeit for what the trade identifies as extreme tourism. The three remaining reactors at Chernobyl reopened just seven months after the explosion there, with one of the reactors working right through to December 2000, since when a small army of workers has been on-site and steadily decommissioning the plant – a process that could still take many years.

Alarmist figures as to the number of people affected by the Chernobyl disaster bear no resemblance to the actual data confirmed by the Chernobyl Forum – a group that includes the UN, the IAEA and WHO – in its 2006 report on the matter. Only 50 deaths can be directly attributed to the accident. These occurred among those workers brave enough to return to the plant when it was burning to sort out the mess at the time, and among a small number of children in the wider community who developed thyroid cancer.

Those who suggest that thousands, maybe even tens of thousands, of fatal cancers are linked to the Chernobyl disaster are basing these estimates on extrapolations from the effects of the atomic bombs dropped on Japan in 1945. These estimates are derived using a linear extrapolation from the effects of high levels of radiation received in an instant as the bombs exploded. But most researchers recognise that the circumstances in Hiroshima and Nagasaki were very different to those in Chernobyl. Such estimates are, therefore, based on rather shaky evidence. It is like suggesting that because a temperature of 200 degrees Celsius would kill 100 per cent of human beings, so a temperature of 20 degrees Celsius should kill 10 per cent of them. In reality, our bodies are able to tolerate radiation up to a certain threshold. Low levels of radiation are almost certainly harmless.
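The temperature analogy above can be made concrete with a toy calculation. The sketch below is purely illustrative: the function names and the 40-degree ‘tolerance threshold’ are assumptions for the sake of the example, not figures from any radiological study. It contrasts a linear no-threshold extrapolation with a threshold model of harm:

```python
def linear_no_threshold(dose, ref_dose, ref_fatality_rate):
    """Scale harm down linearly from a known high-dose reference point."""
    return ref_fatality_rate * dose / ref_dose

def threshold_model(dose, threshold, ref_dose, ref_fatality_rate):
    """Assume exposures at or below a tolerance threshold cause no harm."""
    if dose <= threshold:
        return 0.0
    return ref_fatality_rate * (dose - threshold) / (ref_dose - threshold)

# The essay's analogy: 200 degrees Celsius kills everyone (rate 1.0).
# Linear extrapolation then 'predicts' that 20 degrees kills 10 per cent:
print(linear_no_threshold(20, 200, 1.0))     # 0.1
# A threshold model (assumed 40-degree threshold) predicts no deaths at all:
print(threshold_model(20, 40, 200, 1.0))     # 0.0
```

The same structural choice – whether harm scales linearly all the way down to zero, or vanishes below a tolerance level – is what separates the headline Chernobyl fatality estimates from the figures confirmed by the Chernobyl Forum.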

This brings us back to the contaminated seawater, as well as the affected food items and drinking water in Japan today. The situation is certainly not ideal and no doubt lessons will be learnt from all this as they always are after every emergency. Indeed, whether we appreciate it or not, it is only by taking risks that society has evolved to the form in which it exists today, whereby we live longer and healthier lives than any preceding generation. Sadly, it is only by making mistakes that we learn the real limits of anything. And as some have also indicated, even the worst levels of radiation reported from Japan – aside from those to which a handful of workers have been exposed – amount to little compared to natural background levels in other places on earth, as well as comparing favourably with other exposures we voluntarily subject ourselves to, whether these be through flying or having an X-ray or a CT scan.

The situation now is likely to require a plentiful supply of energy to resolve – energy which, like it or not, will probably come from other nuclear facilities, not from windmills and solar panels. These renewable technologies, while they may be desirable for the future, will only emerge based on what we have available to us in the here and now.

The anti-nuclear campaigners, however – alongside the far bigger army of catastrophists, who seem keen to imagine the worst at every opportunity – are now smugly standing by to say ‘I told you so’. But none of them suggested there would be a tiny crack through which a limited amount of radiation might leak. Rather, there was a cacophony of voices projecting a meltdown and Armageddon. And, as none of these commentators were nuclear engineers who attended the site in Japan itself, it is obvious that all they could do was imagine the worst and project that fantasy into the public domain.

It would be preferable to have a few more trained specialists dealing with the actual emergency. From a sociological perspective, however, one focused particularly on risks and how these are perceived and communicated, it was entirely predictable that an assortment of risk entrepreneurs, doom-mongers and assorted lobbyists would clamour to claim this incident for themselves and attach it to whatever fear-laden view they hold.

Eight years ago, as hostilities resumed in Iraq, there were many determined to uncover Saddam Hussein’s supposed stash of weapons of mass destruction there, despite the evidence consistently pointing to their absence. We were advised instead to focus on the unknown, or the ‘unknown unknowns’ as the US defence secretary Donald Rumsfeld famously put it. Two years ago, once the director-general of WHO had identified H1N1 as ‘a threat to the whole of humanity’, nations everywhere cranked into pandemic prevention overdrive, convinced that only their precautionary actions could save humanity – this despite all the evidence pointing towards the outbreak of a mild version of influenza. We have to recognise that once a particular mindset is established it is very hard for people to accept that their model of the world may not be correct even if the facts are staring them in the face.

This is the pattern being repeated around the nuclear incident in Japan. Some newscasters seem determined to convey the worst that could happen, as if this were some public service. But surely at such times the role of the media is to report the facts rather than imagine a Hollywood script? The problem we now confront is that a significant number of cultural pessimists have staked their reputations on proving that there was a major problem and possibly that this was covered up. Such individuals seem to desire – if not need – the worst, to confirm their apocalyptic frameworks. It is high time we focused on the evidence and let those who are actually capable of dealing with the mess at Fukushima get on with their jobs without having to worry that their every step will be projected on to the world stage as an example of incompetence and conspiracy.

First published on spiked, 29 March 2011

The mad post-tsunami food panic

You could eat Japan’s so-called ‘radioactive spinach’ for a whole year and it still wouldn’t cause you much harm.

It would require an iron will to stand in the face of today’s febrile culture and oppose the wave of countries rapidly withdrawing Japanese foodstuffs from their shelves ‘in line with the precautionary approach’, as a Singapore government spokesperson put it.

Having alerted the world to elevated levels of radiation in food items such as spinach and milk, as well as doses twice the recommended limit for babies in drinking water in Tokyo, the Japanese government really has no one other than itself to blame. After coping admirably in managing the immediate aftermath of the earthquake and the tsunami, as well as demonstrating the resolve to address the situation at the Fukushima nuclear power plant, it seems that it is at the level of communication that the authorities may yet score an own-goal.

The Japanese cabinet secretary, Yukio Edano – until now the image of cool with his detached demeanour and worker’s overalls at press conferences – has asked international importers to take a ‘logical stance’ over the food situation. They will. Unfortunately, it is not the logic he may have had in mind. ‘Even if these foods are temporarily eaten, there is no health hazard’, he advised. Others have indicated that one would have to drink a lot of the water before being harmed. Drinking the water in Tokyo for a year might expose you to an additional 0.8 millisieverts (mSv) of radiation. But then living in some of the places on earth where the natural background radiation is above the norm could easily expose you to 10 times as much.

Needless to say, people continue to live in such areas – and have babies. In fact, there is a considerable body of evidence to suggest that – if anything – their longevities may be enhanced through such exposure. After all, biological life emerged into an environment that had far more radiation, from the ground and from space, than it does today.

Eating the spinach non-stop for a year (perish the thought) would give you a radiation dose equivalent to about one CT scan. Drinking the milk endlessly would be even less of a problem. In fact, you would be sick of eating and drinking these products long before any of them could make you sick from radiation poisoning or cancer.
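The dose comparisons above can be tallied in one place. In the sketch below, the figures for Tokyo tap water and for high natural-background regions come from the text; the CT-scan value is a commonly cited ballpark and should be treated as an assumption:

```python
# Approximate additional annual radiation doses, in millisieverts (mSv).
doses_msv = {
    "Tokyo tap water, drunk for a year": 0.8,       # figure cited in the text
    "high natural-background area, per year": 8.0,  # '10 times as much'
    "one CT scan (ballpark assumption)": 7.0,
}

water = doses_msv["Tokyo tap water, drunk for a year"]
background = doses_msv["high natural-background area, per year"]

# A year of Tokyo tap water adds a tenth of what people routinely receive
# simply by living in a high-background region.
print(round(background / water, 6))   # 10.0
```

The point of the comparison is not precision but scale: the doses that triggered the food withdrawals sit comfortably within the range of exposures people accept, voluntarily and unremarked, every day.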

So where did it all go wrong for Edano? Where did the army of over-zealous officials wanting to ban things on a precautionary basis come from? Should we blame the US – we often do – for starting the cascade? Or was it the media who irresponsibly amplified concerns?

In fact, if we truly hope to understand the confusions now emerging over the situation regarding food from Japan, there is little point in looking there, or even trying to understand nuclear accidents and radiation, or the role of today’s nervous officials and the media.

Rather, since the end of the Cold War in 1989, the world has steadily been reorganised along the principle that it is better to be safe than sorry. That sounds eminently sensible. But is it true? Is there not a point where safety comes at a cost to other areas of life? For instance, if we were to put all our resources into combating terrorism, there would be none left to deal with disease.

Risk management is always about such trade-offs. But the mantra that we should be as safe as possible and always take precautionary measures whenever possible has become good coin among bureaucratic elites the world over. This provided governments with a new role once the old Soviet enemy had imploded. Noting too that the end of the old-style confrontational politics had also left people rather more isolated and insecure, politicians refashioned themselves as the people’s protectors of last resort.

This has come at a tremendous cost to society – leaders driven more by events than by principles, and populations used to having their prejudices pandered to rather than challenged. The rot, of course, started at the top. Hence, witness the large number of foreign nationals in Japan, many of whom were caught up in these tumultuous events and wanted to stay behind to help their friends and loved ones. They even wanted to help complete strangers – but of course we now know, because we have been brought up to believe so, that strangers are a danger anyway.

So, rather than pursuing their humane instincts, according to their own assessment of what the real risks were, many such individuals were advised, by their own national governments, to get out. Get out of the region. Get out of Tokyo. Get out of Japan.

In the past, people who ran away from people in need, particularly when these were people they knew, might have been accused of being cowards. Today, we call that ‘precautionary measures’.

Welcome to the brave new world of risk-obsessed politics. Far from building character and making populations more resilient, as the leaders of some of these countries constantly profess themselves to be doing, what we find is a highly confused culture that encourages a febrile response, both on the ground, and many thousands of miles away.

It is this that might prove to be the greatest problem for the wider Japanese population for quite some time to come.

First published on spiked, 24 March 2011

Human security – a retrospective

Introduction

This paper is not a comprehensive critique of the concept of human security, for which the reader should look to some of the authors cited here and those whom they cite. Rather, it is based on remarks prepared as a discussant at a human security workshop held at La Trobe University in Melbourne, Australia on 8 June 2010.

The concept of human security has come of age. Many writers are today examining how far the concept has come, where from, and in which directions it should now be heading.

Like an artist whose work is about to undergo its first major retrospective, the concept is now the subject of stocktaking – a sign of how entirely mainstream this framework has become today. It is embraced by many across the entire political spectrum, as well as by activists from civil society organizations.

Human security certainly has traction. After all, who could possibly oppose the notions of ‘freedom from fear’ and ‘freedom from want’ for everybody? This formulation, originally attributed to Franklin D. Roosevelt, is the one embedded in the 1994 United Nations Development Programme (UNDP) Human Development Report, authored by the Pakistani economist Mahbub Ul-Haq, which many see as one of the drivers of this agenda.

But maybe the growing popularity of, resonance with, and attempts to implement human security are not based on the reasons that most appear to suppose they are. Maybe it is not simply an inherent good that should be applied across the board of international relations as quickly as possible. In the words of the Canadian academic and policy advisor Andrew Mack, ‘Human security’s importance lies less in its explanatory powers than as a signifier of shared political and moral values.’

If so, we should be alert to some of the unexpected consequences of this characteristic, particularly if the concept itself is found to be wanting.

Human Security – A Retrospective, Global Change, Peace & Security, Vol. 22, No. 3, pp.385-390, October 2010

Risk and the social construction of ‘Gulf War Syndrome’

Abstract: Fifteen years since the events that are held by some to have caused it, Gulf War Syndrome continues to exercise the minds and energies of numerous researchers across the world, as well as those who purport to be its victims and their advocates in the media, law and politics. But it may be that the search for a scientific or medical solution to this issue was misguided in the first place, for Gulf War Syndrome, if there is such an entity, appears to have much in common with other ‘illnesses of modernity’, whose roots are more socially and culturally driven than what doctors would conventionally consider to be diseases. The reasons for this are complex, but derive from our contemporary proclivity to understand humanity as being frail and vulnerable in an age marked by an exaggerated perception of risk and a growing use of the ‘politics of fear’. It is the breakdown of social solidarities across the twentieth century that has facilitated this process. Unfortunately, as this paper explores, our inability to understand the social origins of self-hood and illness, combined with a growing cynicism towards all sources of authority, whether political, scientific, medical or corporate, has produced a powerful demand for blame and retribution deriving from a resolute few who continue to oppose all of the evidence raised against them. Sadly, this analysis suggests that Gulf War Syndrome is likely to prove only one of numerous such instances that will emerge over the coming years.

Risk and the Social Construction of ‘Gulf War Syndrome’, Philosophical Transactions of the Royal Society B (Impact Factor 5.847 ISI 2015), Vol.361, No.1468, pp.689-695, April 2006

A question of fear, not chemistry

‘Many of the concerns about chemicals can best be described as conclusions in search of data.’

On 24 September 2004, the Council of the European Union permanently banned a family of organic chemicals, known as phthalates, from use in toys and childcare items. This ‘political agreement’ finally brought to an end five years of debate about the toxicity of these compounds. During that time, the European Commission maintained a rolling series of temporary emergency bans, despite the scientific research evidence that consistently and increasingly opposed this official view.

Banning phthalates, a family of organic compounds used to soften PVC, appears unexceptional in its own terms. After concerns had been raised as to their possible toxicological impact upon infants, it perhaps seems reasonable to pursue a course of caution and further research. Phthalates’ removal from the marketplace is unlikely to generate much immediate economic pain, even for those companies that produced them.

But concerns about phthalates reflected a growing cautionary climate and helped pave the way for a new European chemicals regulation strategy – REACH (Registration, Evaluation and Authorisation of Chemicals). Now, thousands of chemicals that have been in regular use for over 20 years have to face a battery of toxicological tests, despite our having billions of hours of exposure data as to the consequences of their use.

Again, it may seem sensible to make such testing mandatory. It may surprise some people to learn that chemicals in use prior to 1981 are unlikely to have been subjected to toxicity and carcinogenicity tests. But the tests are unlikely to resolve matters.

The tests have been described as both unfeasible and unnecessary by the UK Medical Research Council Institute for Environment and Health. This is because – as with phthalates before them – the tests are to be performed on a precautionary basis. There is no evidence of any harm. Rather, it is our contemporary culture, which demands constant reassurance at any cost, that may prove the most harmful. REACH will require vast resources, not least in terms of animal testing, but it will be unable to address all possible concerns, and for a period in excess of 50 years it will drive consumer fears rather than assuaging them.

It is important to understand these developments as shaped more by political context than scientific evidence. In the early 1990s, the European Commission and its scientific services went through major physical and cultural reorganisations in the aftermath of the BSE (‘mad cow disease’) debacle. A new cautionary outlook was adopted which effectively advocated pre-emptive strikes in situations of uncertainty. This ‘precautionary principle’ required the use of worst-case scenarios in scientific decision-making.

These developments both reflected and amplified broader trends in society. A proclivity to speculate about what might be now dominates over examination of what actually is. Caution requires extrapolating from uncorroborated or anecdotal evidence – just in case. This has allowed rumour and myth to abound and increasingly to shape our lives. Hence the growing calls to regulate, not just chemicals, but all manner of other products and activities, both new and old, from conkers to vaccines.

But the real driver behind our growing insecurities has more to do with the political disconnection that now dominates contemporary life. As ordinary people no longer form part of active networks as they did in the past, so their tolerance of, and trust in, all forms of authority, whether political, corporate or scientific, have waned. Subjective impressions of reality go unmoderated and grow into all-consuming worldviews not open to reasoned interrogation.

This process has been facilitated by the political, corporate and scientific elites who, lacking any vision or direction of their own, have willingly repackaged themselves as societal risk-managers. Sensing their growing isolation from those they depend upon for authority, leaders now offer to protect us from our fears. An alienated and fearful public is the flipside of an isolated and purposeless elite.

Accordingly, the specifics of any particular issue are only a small part of what shapes the debate. Campaigners’ complaints about minute traces of persistent chemicals found inside their bodies are driven more by their sense of alienation from the decision-making process than by any real grasp of chemistry. They extrapolate from experiments upon rodents, which not only have different metabolisms, but which are also subjected to huge doses of chemicals for protracted periods of time, precisely to see the worst that might happen.

Many of the concerns, such as those regarding so-called ‘endocrine disrupting chemicals’, can best be described as conclusions in search of data. Despite a Royal Society report noting that such chemicals are just as likely to be beneficial, and despite evidence that our foods contain millions of times more of such substances than any chemical we are likely to be exposed to, the decision to assume the worst now drives policy.

It is the authorities themselves that are in the vanguard of driving the agenda. Greenpeace, Friends of the Earth and World Wildlife Fund (WWF) may act as catalysts, but it is the European Commission, the Royal Commission on Environmental Pollution and even many chemical producing companies themselves who, desperate not to lose face in relation to what they assume to be public opinion, are prepared to push through the new cautionary policies.

Far from stabilising matters and reassuring the public, such actions will drive public concerns and shape a far more unstable regulatory environment. Rather than challenging the public with the evidence, the new elite, lacking any purpose of its own, is happy to appear to provide protection by pandering to social fears.

Our obsession with preventing the unthinkable distracts us from more likely sources of risk, diverting social resources from more plausible threats while alarming people needlessly. But the drive to be seen to be taking precautions now determines all. It has allowed groups to pose as champions of consumer welfare, of animals, the environment or future generations, despite their being unelected and unable to represent the dumb, the inanimate or the unborn.

By raising problems at a time when these are in decline, and by positing widespread tests that are neither desirable nor achievable, the authorities make matters worse rather than better. And by adopting politically expedient, yet ultimately intellectually cowardly policies, they display their ultimate contempt for those that they claim to be fighting for.

First published on spiked, 16 November 2004

Review: Cellular Phones, Public Fears, and A Culture of Precaution

Risk analysis today falls broadly into two opposing methodological camps: those who appeal to scientific evidence to explain or critique what they consider to be exaggerated public fears, and those who focus on sociological data to highlight people’s perceptions and hence justify a more precautionary outlook. While most recognize that risk contains both a material element and a perceptual element, there is rarely a meeting of ways in their methods of analysis.

This is where Adam Burgess’ contribution to the debate is to be warmly welcomed. Rather than falsely comparing the statistical risk of one activity with another, as many in the scientific camp are prone to doing, Burgess, a lecturer in sociology at the University of Bath, has produced an explicitly sociological analysis. But rather than taking people’s perceptions at face value, he seeks to explain how these perceptions came to be constructed in the first place, thereby challenging them and critiquing precaution.

Read the full review (pdf)

Cellular Phones, Public Fears, and a Culture of Precaution, Risk Analysis,
Vol.24, No.4, August 2004, pp.1066-1068