Why does the public find a nuclear waste repository risky? Aisling Irwin on the new academic field facing up to a tangle of sociopolitical factors.
The people living near Yucca Mountain in Nevada, in the United States, have been fighting for years against the threat that a nuclear waste repository may be built where they live.
The bitter argument between the people and their government has become intractable. This is due, some say, to a combination of Nimby (not in my backyard), Nimtof (not in my term of office) and the remarkable ability of politicians to discover that "the best geology turns up in the areas of worst political representation", as sociologist William Freudenburg of the University of Wisconsin-Madison puts it.
The impasse is just one example of what American experts are calling "technological gridlock" - a wide and spreading problem in the US today. The experts highlight the paradox that in the westernised world we live in a time of unparalleled good health - and yet the public is developing a vision of a world that is bubbling with toxicity in rivers, air and soil.
Life expectancy is at its highest, with the average for US men and women in the 70s - but worries about health hazards are increasing. Alongside this growing concern comes increased opposition to the siting of oil platforms, waste incinerators and nuclear waste repositories. This, say scientists, is despite the fact that the incinerator in question may blow nothing but steam from its long chimney and the nuclear waste repository may be so carefully constructed that it would be less dangerous than a new highway.
According to James Short, emeritus professor at Washington State University: "In all too many cases, technological trainwrecks have taken place after the investment of millions or even billions of dollars on the hardware issues involved, but with little if any attention to the human issues."
He was speaking at a recent meeting of specialists in a relatively new field - one that examines how and why the public perceives risk. The meeting, at the American Association for the Advancement of Science conference in Atlanta, Georgia, was called to discuss what can be done to escape from the gridlock of public opposition to technologies. The symposium was billed as a first step towards looking for a new approach to, and perhaps even a new paradigm for, the public's perception of risk.
The gridlock has occurred despite the fact that the study of risk perception has made great strides over the past 15 years, progress documented in The Royal Society's comprehensive 1993 report, Risk: Analysis, Perception, Management.
The subject was born out of the sterility of old ideas about the public understanding of risk. One such idea has been dubbed by one academic the "phlogiston theory of risk". It drew a clear line between objective risk, derived by scientists from statistics, and subjective risk, a less credible set of beliefs held by the public. On this view, objective risk was a unique substance given off by a physical process at a rate that could be determined precisely by risk assessment.
Social scientists have now shown there are many practical reasons why it is impossible to develop a single clear-cut assessment of risk. For example, there are at least 10 acceptable mathematical ways of portraying risk. And there are great difficulties in comparing death, chronic illness, effects on the environment and long-term social costs without using value judgements that erode objectivity.
That public views are irrational is another outdated belief about risk perception. This idea has been replaced by the conviction that the public's reaction provides, rather, an interesting set of data to be investigated. Sociologists have decided that the public's concept of risk is informed by much more than the possibility of a fatality. Research has shown that people are less worried if their exposure to a risk is voluntary (smoking, for example) or if they have had personal experience of the risk (unfamiliar risks carry a fear of the unknown); they are more worried by exposures that threaten future generations and by accidents that result from human failure rather than from natural causes.
Another theme of the past decade has been risk communication: how to present assessments of risk to the public. It is a burgeoning subject, but, says Paul Slovic, a leading American specialist in risk perception, "despite some localised successes, it has not stemmed the major conflicts or reduced much of the dissatisfaction with risk management".
Freudenburg says that risk communication has led to some useful discoveries - for example, in honing assertions about risk so that they are more believable. "Our cars are safer than peanut butter" will not reassure, he says. Neither will a scientist saying "don't let this worry you unnecessarily".
But the reason for the American academics' frustration is not that their research is going nowhere. Rather, it is that recent research is revealing that risk perception is part of a tangled mass of sociopolitical factors: status, gender, race, world view, view of nature. They are unearthing a network of causes for the "technological gridlock" that is so complex that it is impossible for them to return to industry or government with the message that a few tweaks to the track will enable the train to run smoothly again. It is because of these new findings that they are calling for a revolution.
Slovic, who works for a company called Decision Research in Oregon, describes several studies that show, for example, that women find hazards more risky than men do. It is possible that women are not as familiar with technology as men are, although this does not account for their higher fears of the dangers of drugs or sun-tanning. And Slovic has undermined this explanation with studies of male and female toxicologists, who would be expected to have equal familiarity with science - the division between men and women remains. There are also biological explanations for the difference. One such theory is that because women give birth they are more concerned about human health.
But traditional explanations have been overturned by a study of 1,500 Americans which discovered that the male/female differences vanish in non-whites. Instead, the researchers found that there is a group of white men that perceives risks as low - and another group, to which everybody else belongs, that perceives risks as higher. Not all white men belong to the low-risk group. Those who do tend to be better educated, have higher household incomes and are politically more conservative. To find out why they differ, researchers asked the subjects about their wider views on the world.
The white men in this separate group were more likely than the others to agree with statements such as "if a risk is very small it is OK to impose it on individuals without their consent" and "we have gone too far in pushing equal rights in this country". They were more likely to disagree with the statements that "the world needs a more equal distribution of wealth" or that "the public should vote to decide on issues such as nuclear power".
It appears that the people in the decision-making jobs, the people who have more control, are the people least afraid of risks. "In sum," say Slovic and the co-authors, "the subgroup of white males who perceive risks to be quite low can be characterised by trust in institutions and authorities and a disinclination toward giving decision-making power to citizens in areas of risk management."
Slovic says that these findings throw some light on why risk communication often fails. "Inasmuch as these sociopolitical factors shape public perception of risks, we can see why traditional attempts to make people see the world as white males do, by showing them statistics and risk assessments, are unlikely to succeed."
Back at Yucca Mountain, a simple survey has demonstrated why it may be useless to interpret the public's risk perception in terms of its scientific understanding of an issue. Sociologists have examined letters to newspapers written by the people of Nevada throughout the time of the US government's tinderbox proposals. As time went on, the riskiness of the nuclear waste repository to health and the environment was discussed less and less. "Risk almost became a third or fourth consideration," says Roger Kasperson, professor of geography at Clark University, Massachusetts. "What emerged instead was a debate about fairness, who has the power and who is doing what to whom."
Similarly, Slovic has found a factor so powerful that it can overwhelm findings based on gender or race. An example comes from the replies of toxicologists and lay people to questions on whether animal tests are reliable at showing whether or not a chemical may cause cancer.
Slovic found that if the animal tests produced good news (no cancer found as a result of administering the chemical) the toxicologists would say that the results should be heeded because animal tests are reliable. But if the tests gave bad news, the toxicologists were likely to say that animal tests could not be relied upon. The public's reaction was also asymmetrical - but in the other direction. If animal tests show no evidence of cancer, the public thinks "who can rely on animal tests?" But if they do show evidence of cancer, then, suddenly, animal tests appear to be much more reliable sources of information.
These and other findings have led to scrutiny of the role of trust in risk perception. Sociologists are discovering that it has enormous influence. The key principle is the "asymmetry of trust": it is far easier to destroy trust than to create it. Negative events (accidents, for example) are more sharply defined than positive events (for example, a list of things that did not go wrong in the running of a nuclear power station); sources of bad news are more credible than sources of good news (a campaigning group versus the government, for example); and there is a powerful news media that emphasises trust-destroying news.
Therefore, a single study that finds a link between cancer and electromagnetic field radiation carries much more weight than 20 such studies that fail to find a link. "Thus," says Slovic, "the more studies that are conducted looking for effects of electric and magnetic fields or other difficult-to-evaluate hazards, the more likely it is that these studies will increase public concerns, even if the majority fail to find any association with ill health."
"In short," he argues, "such risk assessment studies tend to increase perceived risk."
The recent findings of these social scientists suggest two opposing ways forward. One of these is to do fewer risk assessments, to repress information and to move to a more centralised type of decision-making. Slovic compares this to the French model. France produces nearly three quarters of its electricity from nuclear power and was rocked by anti-nuclear protests in the late 1970s. The protests were repressed by the state. Today, the French perceive the risks of nuclear power to be as high as the Americans do. But the French also have a high degree of trust in their government and in the experts who run the nuclear plants. Americans, on the other hand, have more of a feeling that they can control risks to their health and have much less trust in the establishment.
The French system could be regarded as successful. But, Slovic argues, repression of democratic input and protest is probably not achievable in the US for historical and cultural reasons. Because of this, Americans must opt for an alternative route - towards more openness and involvement of the public in making decisions about risk.
Calls for more democratic siting processes in which the public feels it has control are among the least daunting of these proposals. Studies have shown, for example, that promising local people the power to shut down a plant increases the likelihood that they will accept it.
Howard Kunreuther, economics professor at the University of Pennsylvania, has studied the role of compensation in the siting process. He found that for more dreaded risks, such as nuclear waste storage, offering compensation can actually decrease the likelihood of the public accepting the risk, probably because the offer of compensation decreases trust by making people uneasy. He says that siting procedures must be revamped "to induce more legitimate decision-making regarding the type of facility to build and where to build it".
Kasperson agrees, arguing that a more democratic approach to risk management may in the long run be the cheapest way of producing a consensus.
The most daunting suggestion that these social scientists put forward is that the democratic process should be changed so much that "the degree of openness and involvement with the public goes far beyond public relations and 'two-way communication' to encompass levels of power-sharing and public participation in decision-making that have rarely been attempted", says Slovic. But he adds that the rebuilding of trust in American institutions, the key to tackling the public's perception of risk, could take a very long time.
Whatever effect it has over the next few years, the work of these social scientists has shifted the problem far from what it was originally perceived to be - a need to go out and educate the ignorant layperson about science and statistics.