The final plenary session of this year’s DSA sees Sheila Jasanoff of the Kennedy School of Government, Harvard, founder of the Science and Democracy Network and a STEPS Centre Advisory Board member talking on one of her specialist subjects – rethinking risk and regulation. (Photo: Sheila Jasanoff)
Jasanoff, fresh off a plane from Boston, quips that any lack of coherence we hear in her speech is a deliberate normative position to leave things desirably open-ended. She then launches into a quick run through a catalogue of risk, from the 1976 Seveso accident in Italy through Bhopal in 1984 and Chernobyl in 1986 to Hurricane Katrina in 2005, with several stops along the way.
And over this 30-year history of risk, law, politics, policy and scholarship have attempted to respond to risk, from the US Supreme Court’s ‘benzene decision’ on occupational exposure 27 years ago (1980) to the Court’s decision against the Environmental Protection Agency on greenhouse gases earlier this year.
It is possible to draw a distinction between the technocratic approach and the social-cultural approach to risk. The centrepiece of the former is risk defined in terms of the probability and magnitude of harm. The risk assessment is then communicated, economic trade-offs are assessed, public perceptions of risk are measured and new management institutions are established.
Central to the social-cultural approach is putting risk in context – how risks are recognised and distributed, and why some risks are not acknowledged. The cultural dimensions of risk are investigated: why risk perception and acceptance differ across nations, and how politics affects the recognition, assessment and management of risk.
“When we think about major events in the world, the technocratic approach has produced some remarkable areas of blindness. One is the way in which risks were or were not seen in the run-up to 9/11,” said Jasanoff. “Although there had been previous attacks on the twin towers, they were understood as one thing alone – an office building – and not as objects that participate in other kinds of networks… Nor were they seen as targets for military action, nor was their symbolic attraction thought about by risk assessment.”
And when discussing the stresses that the towers could withstand, structural engineers had not thought about a fully-loaded plane – fully loaded with fuel or people. Meanwhile, airport security was ‘fighting the last war’: suicide attacks with planes were not imagined, although they had been forewarned. “Risk assessments are only as deep and rich and good as the imagination of the people sitting around the table, and those imaginations are often incredibly constrained,” says Jasanoff.
It’s interesting how powerful the technological fix remains, she says, citing the US Challenger disaster and the UK BSE crisis (the first time around) as examples. 9/11 has also been translated into a number of technological solutions – the war on terror, airport security – as has Hurricane Katrina – higher levees and the ‘de-concentration’ of poverty, a term that treats poverty as something that can be put in solution to dissipate it.
There’s something wrong with the term risk, thinks Jasanoff: it relies too much on a single point of origin from which measurable outputs emanate. Should we not be thinking about risk hand-in-hand with theories of political economy and geopolitics? A sense of scale is also often missing from risk assessment – for instance, the large numbers of farmer suicides in India.
So how should we think about the agenda of risk analysis, especially in the development studies context? The technical approach does not make sense unless the social is kept in mind. Most risky situations and disasters are hybrids in that sense. But the social dimensions of risk are the least studied and understood. The boundaries of risk need to be re-examined: boundaries between kinds of risk, between analysis, management and prevention, and between ways of knowing and understanding risk.
And finally, the normative dimension needs to be restored: how do risks affect people, and who has the power to create new risks? Who is able to participate in, influence and act on controls, and how should history matter in regulation?