The ICRP recommendations show a striking imbalance between a highly prescriptive system, full of figures of implausible precision, and a scientific background that has improved little since the previous recommendations. Moreover, the ICRP remains attached to a very isolationist conception of radiation protection. This is certainly justified when the radiological risk is the dominant one, but certainly not in ordinary situations where this risk is merged into a background that includes many other kinds of stressors in addition to natural and added radioactivity. One may therefore wonder whether this system can easily be extended from workers to the public and to the environment.
The most serious scientific weakness lies in the dose concept. The ICRP promotes a single indicator for evaluating radiological risk. It assumes that, for a given value of this indicator, all situations are equivalent as regards the risk. This implies giving a very important place to the dose corresponding to the natural background.
It is true that the concept of an indicator based upon the energy released in tissues has had some past successes in specific cases. But the question is whether an extrapolation to very different cases is possible. Some simple arguments and some analogies give grounds for doubt.
Dose seems to work well when the number of degrees of freedom is close to one. This is the case for a short, intense gamma irradiation from a source outside its target, since then all constituents of the target tissue (provided it is not too thick) have an equal probability of being affected by ionization. Since this probability distribution is uniform, it seems plausible that a single parameter can summarize the whole information. The flux from the source and the type of target tissue are then enough to characterize the situation. It is therefore not surprising that, in such cases, a good correlation can be established between the indicator and the real effect.
The situation becomes more complex when further degrees of freedom are added to the system. Even in the "simple" case of gamma exposure, adding the time parameter (a flash exposure versus an exposure spread over a long time) fundamentally changes the situation. The scarce available experimental data show that a purely gamma exposure at a constant dose rate begins to produce visible consequences only above substantial dose rates. For example, a recent experiment was conducted at CEFAS-Lowestoft with woodlice. After a 14-week exposure, a slight effect was found only above a dose rate of 8 mGy per hour, which is nearly a factor of 100,000 above the gamma component of the usual background. Such figures support the idea that gamma exposure at rates similar to that of the environment (or even two orders of magnitude above it) is unlikely to be a real risk factor deserving specific treatment in a radiation protection system. Moreover, this invalidates the use of the natural background as a reference since, precisely, in the natural background the constant gamma component, likely ineffective, is an important part of the total dose.
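The factor of 100,000 quoted above can be checked with a back-of-the-envelope calculation. The sketch below assumes a typical external gamma background of roughly 0.7 mGy per year, which is an assumed illustrative value, not a figure from the text; the 8 mGy/h threshold is the one reported from the woodlice experiment.

```python
# Order-of-magnitude check of the "factor 100,000" claim.
# Assumption: external gamma background ~0.7 mGy/year (illustrative).
HOURS_PER_YEAR = 365.25 * 24

background_mGy_per_h = 0.7 / HOURS_PER_YEAR   # ~8e-5 mGy/h
effect_threshold_mGy_per_h = 8.0              # CEFAS-Lowestoft woodlice figure

ratio = effect_threshold_mGy_per_h / background_mGy_per_h
print(f"background gamma ~ {background_mGy_per_h:.1e} mGy/h")
print(f"effect threshold / background ~ {ratio:.0e}")
```

With these assumed inputs the ratio indeed lands close to 1e5, consistent with the "nearly a factor 100,000" stated in the text.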
The situation is even more complex in the case of exposure from internal sources, since supplementary parameters must then be added to describe the micro-distribution of energy release in tissues. When energy is released locally, the neighborhood of the source point is necessarily subjected to a higher radiation flux (it varies roughly as the inverse square of the distance) than distant zones. This effect is stronger for short-range emissions. There is therefore no reason to fear such a situation in the case of gamma radiation, even from internal sources. On the other hand, it may occur in the case of alpha and beta radiation, and particularly for very short-range radiation from Auger emitters. In any case, if the radioactive emitter is evenly distributed, all constituents of the target tissue, and even all cellular sub-structures, are indistinguishable as regards the risk. A dose based upon the mean energy release is then a plausible concept, especially if a correction (the radiation weighting factor Wr) is applied according to the kind of radiation. On the contrary, and this is where the real difficulty lies, if the chemical state of the emitter is such that it binds to specific targets, then a "selectivity" parameter has to be introduced into the construction of the exposure indicator. This means that a relation between concentration and effect has to be derived for each radionuclide and each possible chemical state. The simplicity of the sievert is then lost, and substantial experimental work is needed to validate these relations.
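The inverse-square argument above can be made concrete with a minimal sketch. All numbers here are hypothetical and chosen only to show the scale of the effect: tissue a micrometre from a point emitter sees a fluence a million times higher than tissue a millimetre away.

```python
# Illustration (hypothetical numbers) of the inverse-square argument:
# the particle fluence around an isotropic point source scales as
# 1 / (4 * pi * r**2), so the immediate neighborhood of an internal
# emitter receives a vastly higher flux than distant tissue.
import math

def fluence(n_particles: float, r_m: float) -> float:
    """Particles per m^2 crossing a sphere of radius r around the source."""
    return n_particles / (4.0 * math.pi * r_m**2)

n = 1.0e6                  # assumed number of emitted particles
near = fluence(n, 1e-6)    # 1 micrometre from the source
far = fluence(n, 1e-3)     # 1 millimetre from the source
print(f"fluence ratio near/far = {near / far:.0e}")
```

The ratio is just (1 mm / 1 µm) squared, i.e. a factor of one million, which is why a mean dose over the whole tissue can badly misrepresent the local situation for short-range emitters.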
The previous remarks are not purely theoretical, since some available examples from the chemical field clearly demonstrate that toxic selectivity may matter far more than energy release. For example, the LD50 lethal dose of botulinum toxin is around 1 ng/kg. Since it is a very large molecule (around 100,000 Daltons), the number of molecules necessary to cause an effect is very small, and the energy involved in the reactions does not even reach 1 nJ/kg! If we calculated a dose in Gy as we would for a radiation impact (a chemical reaction is nothing but bond breaking and bond formation), we would find lethal doses as low as 1 nGy! This clearly shows that the applicability of the sievert concept becomes more and more questionable as toxic selectivity increases. Special attention should therefore be given to cases where the two unfavorable characteristics are combined: short radiation range and strong chemical affinity for the most sensitive targets.
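The ~1 nGy figure can be reconstructed from the numbers given in the text. The sketch below uses the stated LD50 (~1 ng/kg) and molar mass (~100 kDa), plus an assumed energy of about 1 eV per chemical bond event; the last value is a rough order-of-magnitude assumption, not a figure from the text.

```python
# Back-of-the-envelope reconstruction of the "lethal dose ~1 nGy" claim.
# Assumption: ~1 eV of energy per bond-breaking/forming event (illustrative).
AVOGADRO = 6.022e23
EV_IN_J = 1.602e-19

ld50_g_per_kg = 1e-9      # LD50 ~ 1 ng per kg of body mass
molar_mass_g = 1e5        # ~100,000 Da

molecules_per_kg = ld50_g_per_kg / molar_mass_g * AVOGADRO
energy_J_per_kg = molecules_per_kg * EV_IN_J   # J/kg, i.e. a "dose" in Gy
print(f"{molecules_per_kg:.1e} molecules/kg -> {energy_J_per_kg:.1e} Gy")
```

Roughly 6e9 molecules per kilogram and an energy deposition near 1e-9 J/kg, i.e. about 1 nGy, matching the text's point that the lethal "dose" is vanishingly small in energy terms.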
As noted above, natural radioactivity is not a convenient reference, since it includes an important proportion of low-dose-rate gamma radiation. If we exclude radon (which is not completely natural, since it depends on air renewal in dwellings), the other main components of natural radioactivity that operate through internal pathways do not share the selectivity problem raised above. Both K-40 and C-14, the two main contributors to internal dose, have a very homogeneous distribution that rules out any concentration of energy release on very tiny spots. It is therefore quite credible that natural radioactivity is ineffective, which is not an ideal characteristic for a reference!
We may likewise question the concentrations given by the ICRP to define the practical threshold of non-radioactivity (the "exclusion level"). One cannot simultaneously support the idea that natural and artificial sieverts are equivalent and accept different concentrations for alpha emitters according to their natural or artificial origin. The previous arguments suggest that, in situations of internal contamination, the relevant parameter is not the origin but rather the affinity of the radioactive emitter for specific targets.
In addition to the scientific problem, we have to solve the practical problem of managing the risk in the context of uncertainty inherent to these kinds of situations. On this point, it seems to me that a management approach that acknowledges the lack of data more openly would be easier to explain. The ICRP method starts from cases where effects are known, extrapolates to derive a dose-effect relationship, assumes a socially acceptable effect threshold, and finally derives a dose threshold to be used in regulations.
An alternative method would be to start from the same cases where effects are observed and to take a safety margin, to avoid the risk of reaching situations where the effects occur. This is what is done in toxicology, where safety factors of 100 or 1000 relative to laboratory tests are usual. The advantage of this method is that we do not have to claim to understand what happens at extremely low levels; we compensate for this lack of knowledge with the margins. It is interesting to note that the final result is not so different from the ICRP recommendations: roughly, effects are observed starting from some hundreds of mSv, which, with a safety factor between 100 and 1000, sets the dose limit between 0.1 and 1 mSv.
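The safety-factor arithmetic described above can be sketched in a few lines. The 100 mSv starting value below is an illustrative stand-in at the low end of the "some hundreds of mSv" mentioned in the text, not a figure the author commits to.

```python
# Sketch of the toxicology-style safety-factor method: divide the lowest
# level at which effects are observed by a factor of 100 to 1000.
lowest_observed_effect_mSv = 100.0   # illustrative ("some hundreds of mSv")

limits = {sf: lowest_observed_effect_mSv / sf for sf in (100, 1000)}
for sf, limit in limits.items():
    print(f"safety factor {sf:>4}: derived limit = {limit} mSv")
```

With these inputs the derived limits span 0.1 to 1 mSv, which is indeed close to the regulatory dose limits the ICRP system arrives at by its extrapolation route.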
This method has some pedagogic advantages:
1 It does not help maintain the false idea that knowledge is precise enough to allow subtle optimizations
2 It gives reference status to situations where the detriment can be observed, rather than to situations where it is hypothetical
3 It reminds us that in risk management there is always a bet to live with. Nothing proves that a not-yet-discovered and particularly severe case cannot exist (if botulinum toxin can operate with 1 nJ/kg, why could some very exotic radionuclide not do the same?). At the limit of knowledge, the safety margin depends on the advice of an expert who ideally should have as wide a knowledge as possible of similar cases, including biological and chemical stressors.
In a situation where the radiological risk (accidents aside) is neither a major problem nor a growing one (since the radioactive contamination of biota in a country like France is decreasing), it would seem logical not to modify, without strong arguments, a regulatory system that is already highly protective. It would be more useful to direct the workforce toward basic research, in order to gain a better understanding of the least-known situations. This is more a research problem than a matter of over-regulation or over-monitoring. We can congratulate the ICRP for adding the environment to the list of what should be protected. It would be wise to take this opportunity to reorient radiation protection toward the truly open problems: bioconcentration processes in ecosystems, or in sub-cellular components exposed to chemically selective radioactive compounds emitting very short-range radiation. At the same time, it would be worth restraining the easy option of regulating and measuring everything that is easily measurable (especially gamma exposure when the dose rate is low and there is no indication of inhomogeneity) without really asking whether it is useful to gather so many data.
PS: This is the English version of the text submitted in French on 25/11/2004. In case of any misunderstanding due to my imperfect English, please refer to the French version.