Welcome to PSR's Environmental Health Policy Institute, where we ask questions -- then we ask the experts to answer them. Join us as physicians, health professionals, and environmental health experts share their ideas, inspiration, and analysis about toxic chemicals and environmental health policy.

Making Precaution Our Default Policy

By Lin Kaatz Chary, PhD MPH

This essay is in response to: How can we set science-based policies in the face of scientific uncertainty?

Many of us in the public health community, and especially those working in environmental health professions, have come to the conclusion that our current approach to addressing scientific uncertainty, which is inevitable, has not been successful and must be changed. There is a growing recognition that the cornerstone of our approach to scientific uncertainty must rely on the fundamental public health principle of primary prevention. When faced with uncertainty in the assessment of chemicals that have the potential for irreparable harm, sometimes over generations, the default position must be precaution.

Uncertainty is a part of life. We constantly assess hazards and risks in our daily affairs, weigh benefits against consequences, and often must make choices without complete knowledge of the outcome. In many cases this process is routine, recognized as a fact of life. Our language is filled with adages that urge caution when we have incomplete information: "better safe than sorry," "look before you leap," "don't count your chickens before they hatch," "a bird in the hand is worth two in the bush." If we stop and think about it, we know that we base our decisions not on the assumption that hazard and risk don't exist, but on the understanding that not all hazards and risks are equal. Before we act, we assess the evidence we have and base our decisions on several key factors: do we have enough information to proceed? How daunting are the known hazards? Are the results worth the risks? If we decide incorrectly, will the consequences be fixable or irreparable, temporary or permanent, mild or severe, inconvenient or life-changing? Are there better alternatives?

As a society, however, when it comes to the larger decisions that affect everyone, we have been acculturated to the belief that there is little that modern science can't know or find out, and that, as a result, there is little we can't control, from "better living through chemistry" to finding a cure for cancer. Particularly in an economic system that relies on predicting the future as accurately as possible, uncertainty in science can be difficult to tolerate. The solution, enshrined in current policy on toxic chemicals, is to prefer what statisticians call a "Type 2" error: in the face of uncertainty and missing data, it is considered better to risk a false negative – concluding that a chemical has no effect when it is actually hazardous – than to risk a "Type 1" error, or false positive, which asserts harm when the chemical is really safe.
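The asymmetry between the two error types is easy to see in a simulation. The sketch below is a toy Monte Carlo with made-up tumor rates and sample sizes, not drawn from any real bioassay: it runs a simple two-group animal study many times, and when the chemical is genuinely harmful but the study is small, the test misses the harm (a Type 2 error) a large fraction of the time, while the Type 1 error rate stays pinned near the chosen significance level.

```python
import random

random.seed(42)

def two_prop_z(x1, n1, x2, n2):
    """Normal-approximation z statistic comparing two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = (p * (1 - p) * (1 / n1 + 1 / n2)) ** 0.5
    return (p2 - p1) / se if se > 0 else 0.0

def trial(base_rate, exposed_rate, n):
    """One simulated animal study; True if the test flags harm."""
    control = sum(random.random() < base_rate for _ in range(n))
    exposed = sum(random.random() < exposed_rate for _ in range(n))
    return two_prop_z(control, n, exposed, n) > 1.645  # one-sided, alpha = 0.05

runs = 2000
n = 50  # animals per group -- deliberately small, as in many bioassays

# Chemical truly harmful (hypothetical): tumor rate 5% in controls, 15% exposed.
misses = sum(not trial(0.05, 0.15, n) for _ in range(runs))
print(f"Type 2 (false negative) rate: {misses / runs:.0%}")

# Chemical truly safe: both groups at the 5% background rate.
false_alarms = sum(trial(0.05, 0.05, n) for _ in range(runs))
print(f"Type 1 (false positive) rate: {false_alarms / runs:.0%}")
```

With groups this small, a tripling of the tumor rate is still missed in roughly half of the simulated studies, while false alarms remain near the 5% significance level – the "designed-in" asymmetry the essay describes.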

The problem is that when we are discussing environmental health, and exposure to toxic chemicals in particular, giving chemicals a "clean bill of health" and letting them loose into the environment when they are actually quite risky is a backward-looking, reactive approach that transfers the burden of risk onto the general public, and creates a situation where exposed populations become the guinea pigs on whom the effects of chemicals are tested in real life. Once the health effects begin to be known, it's too late. Millions of people have been exposed, and while it may be possible to remove the substance from use (a very rare event, it should be noted), people who have been exposed cannot become unexposed. The damage is done.

Policy makers, chemical manufacturers, and environmental health workers are currently in the midst of a very intense debate over how to deal with scientific uncertainty, including conflicting outcomes from studies and substances whose effects do not conform to traditional linear dose-response relationships. A case in point is endocrine disruptors, a class of chemicals whose ranks are growing as the class receives more attention in the lab and in the policy arena. The crux of the question is how to act when the regulated community believes there is inadequate information and conflicting data about endpoints, toxicity, hazard, and risk. When are there enough data to reach a credible scientific consensus and then move forward with policies and regulation, either to allow use – with or without limits and controls on exposure – or to remove the chemical from production and use? Both the amount and the quality of the data are critical factors, but how is the decision to be made?

The answers to these questions have very significant implications for both manufacturers and the public, and the consequences of acting either prematurely or too late can be substantial for both the market and public health. What is at stake is the future of many substances that are currently ubiquitous in the marketplace, such as PVC (polyvinyl chloride) plastics, bisphenol-A (BPA), brominated flame retardants, and perfluorinated compounds used in non-stick coatings on many products, including food packaging and utensils. While a growing body of data shows that many of these chemicals are being found in the tissues of both wildlife and humans, and that several endpoints have been identified in laboratory animals exposed to them, there are also some data showing no associations with exposure. Of greater concern to manufacturers, some policy makers, and some public health professionals is the similar lack of data demonstrating unequivocal specific impacts in humans correlated with the exposures measured in human tissue and blood. Does the mere presence of these chemicals in humans signal a risk and an indication of pathology? Or does the absence of epidemiological studies clearly associating these exposures with specific health endpoints signal no need for concern?

Complicating the picture, many researchers in the field point out that current levels of these chemicals in humans are equivalent to those seen in laboratory animals in which significant health effects have been documented. Can these data be extrapolated to humans? Should they be? Are there enough data to justify removals from the market and/or bans on further production and use? Removing these chemicals from use would have an enormous economic impact, not to mention the costs and time required to find safer alternatives and bring them to market. It is not surprising, therefore, that the manufacturers, and the politicians who represent them, demand a very high standard of scientific certainty before condemning these chemicals as unsafe, arguing that existing data on health risks are inadequate to justify either the known economic consequences or, in many cases, the real disruptions that would follow the withdrawal of certain products for which no acceptable alternatives exist.

We are used to basing our policy decisions on assessments of exposure and of the risk associated with exposure, on the assumption that if there is no route of exposure, there is no risk. What has become increasingly clear, however, is that determining routes of exposure for many chemicals – endocrine disruptors, for example – and understanding what amounts of exposure are relevant is a far more complex effort than originally believed. We now know, for example, that not all chemicals exert their effects in traditional linear dose-response relationships. For some chemicals, the timing of exposure for the fetus is more critical than the amount, and sometimes very low doses are more damaging than higher doses. Some chemicals do not cause obvious cellular changes or disease processes but affect genes – not as mutations, but by interfering with the expression of the gene. Some chemicals have effects that appear not in the exposed individual but in her offspring, and even in subsequent generations. New discoveries regarding hazards and exposures require new ways of evaluating what risks we are willing to take while we debate how much uncertainty we can tolerate. Because while these debates go on, and these investigations continue, our children and our families are still being exposed. There is no uncertainty about that.
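The contrast between a traditional linear dose-response assumption and a non-monotonic one can be sketched numerically. The toy curves below are illustrative only – the inverted-U shape and all parameter values are hypothetical, not fitted to any real endocrine-disruption data – but they show why testing only at high doses can miss effects that peak at low doses.

```python
import math

def linear(dose, slope=0.08):
    """Traditional assumption: effect scales proportionally with dose."""
    return slope * dose

def inverted_u(dose, peak=5.0, height=1.0):
    """Toy non-monotonic curve: effect rises at low doses, falls at high doses.
    Shape and parameters are hypothetical, for illustration only."""
    return height * (dose / peak) * math.e ** (1 - dose / peak)

doses = [0.1, 1, 5, 10, 50]
print(f"{'dose':>6} {'linear':>8} {'non-monotonic':>14}")
for d in doses:
    print(f"{d:>6} {linear(d):>8.3f} {inverted_u(d):>14.3f}")
```

Under the linear model the effect at dose 50 dwarfs the effect at dose 5; under the non-monotonic model it is the reverse, so a study that only tests high doses would report "no effect" for a chemical whose impact is concentrated at low, environmentally realistic exposures.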

This is a function of the fact that current environmental health policy is deeply reactive, rooted in a model that allows exposure to chemicals based on a quantitative risk assessment which determines an "acceptable" level of risk and takes for granted that one or two "additional" cancers in a population of 100,000 or 1,000,000 is the best that can be done. Hazard is a given, and exposure is expected to be controlled and minimized through pollution mitigation strategies such as permits limiting discharges and effluents, emissions control equipment such as scrubbers on incinerators, restrictions on chemical use, and so forth. Risk and exposure are considered "managed," and everyone goes home happy – except for the individuals who develop the additional cancer or, more likely, one of the many chronic, debilitating conditions considered acceptable at this level of exposure.
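The arithmetic behind an "acceptable" one-in-a-million figure is straightforward. The sketch below follows the general form of a linear no-threshold drinking-water risk calculation (excess risk = chronic daily intake × cancer slope factor); every input value here is hypothetical, chosen only to land near the conventional benchmark, not taken from any real chemical's assessment.

```python
def chronic_daily_intake(conc_mg_per_L, intake_L_per_day=2.0,
                         exposure_days_per_yr=350, exposure_years=30,
                         body_weight_kg=70, averaging_days=70 * 365):
    """Chronic daily intake (mg/kg-day), averaged over a lifetime.
    All default values are hypothetical illustration inputs."""
    return (conc_mg_per_L * intake_L_per_day * exposure_days_per_yr
            * exposure_years) / (body_weight_kg * averaging_days)

cancer_slope_factor = 0.5         # (mg/kg-day)^-1, hypothetical
conc = 1e-4                       # mg/L in drinking water, hypothetical

cdi = chronic_daily_intake(conc)
risk = cdi * cancer_slope_factor  # linear no-threshold assumption
print(f"Excess lifetime cancer risk: {risk:.1e}")
print(f"Additional cancers per 1,000,000 exposed: {risk * 1_000_000:.1f}")
```

The point of the essay stands out in the last line: the calculation does not ask whether anyone should bear the added risk, only how large a population must be before the "additional" cancers become statistically visible.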

More importantly for this discussion, this exposure management/risk control approach finds a parallel expression in how scientific uncertainty is handled. When there is uncertainty, the default approach is to assume "innocent until proven guilty," the burden of proof falls on the exposed population, and, if mistakes are made, they will be Type 2 errors – false negatives, assuming no harm or low risk when the opposite is true. The consequences of a Type 2 error are far more favorable to the marketplace, particularly because the true costs of these errors are externalized to society at large rather than borne by the manufacturers, users, and regulators. This approach is clearly designed to favor Type 2 errors. In this paradigm there is no room for a Type 1 error, the false positive – asserting harm or high risk when it turns out there is little or none; it has been purposefully designed out of the system.

This paradigm must change. We must redesign our chemicals policy to be proactive, to favor erring on the side of precaution when there is uncertainty, and to make precaution the default position where missing data and conflicting results prevent clear conclusions. We should rely on the weight of the evidence when there are unanswered questions and imperfect data. This means the willingness and capacity to take action to protect the public before we have perfect or complete information. This is not without precedent. Decisions on two major public health threats were made long before we had complete data: the risks of disease from smoking, and the dangers of the continued manufacture and use of PCBs. On the other hand, decisions that should have been made on the dangers of polybrominated biphenyls (PBBs) back in the 1970s were not, with the result of thirty more years of exposure before the closely related octa- and penta-PBDE flame retardants were finally taken off the market, with deca-PBDE also showing serious effects. Why are we willing to continue gambling on the well-being of our children and our environment when we have enough evidence to recognize significant threats?

Implementing the precautionary approach requires immediately adopting three additional concrete strategies: (1) shifting the burden of proof of safety onto the manufacturers of chemical products and processes; (2) activating a policy of "no data, no market," meaning that chemicals lacking hazard data about their effects on human health and the environment cannot enter the marketplace; and (3) promoting the development of safer alternatives and substitutes. None of these is beyond current regulatory or technological capacity. All of them directly address the problem of scientific uncertainty, and none interferes with either the freedom of the market or the freedom of scientists. To the contrary, we can anticipate that a proactive effort to deal with scientific uncertainty should stimulate both innovation and the market.

Adopting a precautionary approach, and placing greater emphasis in our regulatory decisions on the inherent hazards of chemicals rather than on exposure and risk, is not only the most rational and effective approach to protecting the public's health; it is an expression of the most deeply held value of public health principles and practice – primary prevention. In an imperfect world, we will always have to make choices in the face of incomplete information. Our task is to make the best choices we can based on the information we do have, and not to be afraid to exercise precaution in the short term to assure the best outcomes over the long term.
