[Part I of this series appeared here]
Today, we discuss types of radiation and their health effects. If you’re learning something, it’s usually a good idea to take several passes at the knowledge. On the first pass (engineers would prefer ‘noughth’), you get the 5 second summary: “fire is hot;” “life forms have a common ancestor;” “the brake pedal is on the left.” Then you can go back and fill in the gaps in your knowledge on second, third, etc. passes, with a popular, basic summary, followed perhaps by a textbook or a course.
If we’re talking about the health effects of ionizing radiation, the first-pass lesson is “radiation is dangerous — minimize exposure.” Well, that’s better than nothing. But if we take one more pass, we might be able to get to a more useful understanding. What kinds are most dangerous? How much is too much? How can we minimize exposure?
By learning a bit more, we can avoid both underreaction (due to unrecognized dangers) and overreaction (due to inappropriate fear). By analogy, imagine if nobody had any clue that gasoline was flammable, but we all ran screaming away from votive candles!
It’s hard to present this material in a nice logical order, because there are many interdependencies. So I’ll just start somewhere.
Types of ionizing radiation
To call radiation “ionizing” is to say that it has the capacity to create ions in materials it comes in contact with, typically by (directly or indirectly) ripping electrons off of atoms. There are three commonly encountered types of ionizing radiation, referred to as α (alpha), β (beta) and γ (gamma).
Alpha radiation occurs when an unstable nucleus decays and ejects a high-energy alpha particle. It turns out that alpha particles are simply Helium nuclei: 2 protons and 2 neutrons. Accordingly, when an element undergoes an alpha decay, the result is an element with two fewer protons and two fewer neutrons: for example, Plutonium-239 alpha-decays to Uranium-235.
Because it has a charge of +2, the alpha particle is highly ionizing and therefore quite dangerous. However, alphas have a very short range in air (a few centimeters) and are very easily blocked by, for example, a single piece of paper, or the outer layer of dead skin on a person’s epidermis. Accordingly, external exposure to alpha particles is relatively safe — you could pick up a large sample of an alpha source such as plutonium with your bare hands and expect no ill effects. However, internal exposure via ingestion or inhalation of alpha-emitters is very dangerous, since inside our bodies there is no alpha-stopping barrier to protect our cells. You may recall the murder by poisoning of former KGB officer Alexander Litvinenko in London; he was killed by acute radiation sickness from ingestion of Polonium-210, a strong alpha emitter (see below for more on radiation sickness).
Beta decay comes in two varieties: β- and β+. These represent the ejection from the nucleus of an electron or positron (charge -1 or +1), respectively. β- decay turns a neutron into a proton; the electron must be ejected to maintain conservation of charge. Therefore, a β- decay leaves the atomic mass number unchanged, but moves the atom forward in the periodic table: for example, Thorium-231 undergoes a β- decay to Protactinium-231. For β+ decay, just the reverse is true.
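The proton-and-neutron bookkeeping for these two decay types can be sketched in a few lines of code (the function names are mine, purely for illustration):

```python
# Sketch: bookkeeping for the decays described above.
# A nuclide is represented as (Z, A) = (atomic number, mass number).

def alpha_decay(z, a):
    """Alpha decay ejects a helium nucleus: Z drops by 2, A drops by 4."""
    return z - 2, a - 4

def beta_minus_decay(z, a):
    """Beta-minus decay turns a neutron into a proton: Z rises by 1, A unchanged."""
    return z + 1, a

# Plutonium-239 (Z=94) alpha-decays to Uranium-235 (Z=92):
assert alpha_decay(94, 239) == (92, 235)

# Thorium-231 (Z=90) beta-decays to Protactinium-231 (Z=91):
assert beta_minus_decay(90, 231) == (91, 231)
```

Note that the alpha-decay example matches the Plutonium-to-Uranium decay above, and the beta example the Thorium-to-Protactinium one.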
Beta particles also ionize, though not as strongly as alphas. However, they do penetrate farther, and therefore represent a more serious health risk for external exposure. Betas have a typical range in air of 5-10 meters (depending on their energy) and can generally be blocked by, for example, a few millimeters of aluminum. Accordingly, they are not too difficult to shield oneself against externally, provided one is aware that they are present. A good example of a beta-emitter is Potassium-40, the largest source of natural radiation in human and animal bodies. Because Potassium-40 gives us an internal exposure, it is pretty much impossible to shield ourselves against it.
Gamma radiation is the most penetrating type of ionizing radiation, and represents high-energy photons — in other words, high-energy light. Gamma decays can be thought of as secondary decays, for when an alpha or beta decay occurs, the daughter nucleus is often left in an excited state of excess energy. This energy can be released by emission of a high-energy photon — a gamma.
These photons easily penetrate the entire human body, and can only be effectively stopped by a significant barrier, such as a thick block of lead (exactly how thick depends on their energy). Gammas occur naturally in cosmic rays, but they also show up in another guise — as X-rays. The differing terminology reflects their sources more than any qualitative difference: X-rays are from accelerating electrons, while gamma rays are from nuclear decay. Given the lack of meaningful distinction, I will refer to photons from both sources as gammas.
A good example of a gamma source is the medical isotope Caesium-137. Upon decaying to Barium-137, the daughter nucleus is left in an excited state, and releases a gamma ray, which ionizes the surrounding tissue. Because rapidly dividing cancer cells are more vulnerable to this ionization than most healthy cells, gamma radiation from decaying Caesium-137 is used for radiation therapy.
Alpha, beta and gamma radiation are the three most important and common types of ionizing radiation, and they all have one characteristic in common that is worth mentioning: they do not cause further radioactivity in the materials that they affect. This fairly elementary point is worth bearing in mind when, for example, you see controversy surrounding the use of irradiated produce. The radiation is used to kill bacteria, but it definitely will not cause your head of lettuce to become radioactive, any more than shooting somebody will cause them to emit bullets. However, it may induce subtle chemical changes that some groups claim (without much evidence) could be harmful.
The exception to this rule is neutron radiation, which can indeed transmute one element into another (usually radioactive) one in a process known as neutron activation. However, neutron radiation is very rare outside nuclear reactor cores. Typically, the only people who need be concerned with the health effects of neutron radiation are workers in nuclear plants — and then only when they are exposed to a chain reaction in progress. Finally, proton decays occur very rarely, in some natural elements, but are essentially irrelevant to discussions of nuclear safety.
Fallout refers to the release into the environment, not of radiation per se, but of radioactive sources. Unlike alpha, beta and gamma radiation, fallout does indeed contaminate affected materials with radioactive elements. Fallout may be from two main sources: nuclear weapons detonations (not the topic of these posts), and certain types of nuclear accidents, such as those at Chernobyl and Fukushima Dai-ichi.
A future post will look at what happens in a nuclear reactor more closely, but for the sake of discussing fallout we will simply note that when an atom of fissile material such as Uranium-235 breaks up in a fission reaction, the main products are (a) lots of heat you can use to make steam to turn a turbine, and (b) two highly radioactive fission fragments (for example, Krypton-89 and Barium-144 — the exact elements vary). You may recall from the first post the important fact that any randomly chosen combination of protons and neutrons is almost certainly radioactive. This applies to fission fragments, which are, in effect, selected quasi-randomly from the space of nuclides. These fragments, almost always highly radioactive, begin themselves to decay, and so do their daughters, until (eventually) the decays reach a relatively stable nuclide. After a reactor has been running for some time, these fission fragments and their offspring are highly concentrated in the fuel rods and the cladding that contains them.
In the event of a reactor fire, as for example at Chernobyl, fission fragments may be released into the atmosphere in smoke, resulting in fallout contamination over a large area downwind of the fire. Two of the most worrisome fallout particles are Iodine-131 and Caesium-137, due mostly to their ease of absorption in the body. Upon entering the body, these nuclides will emit radiation (almost always beta and gamma), causing internal radiation exposure.
Why is ionizing radiation bad?
Ionization disrupts cell chemistry, with three important potential health outcomes:
- Radiation sickness (essentially large-scale cell death);
- Cancer (uncontrolled cell growth, due in part to mutation);
- Genetic abnormalities (in descendants, due in part to mutation).
Radiation sickness is associated with a single, large dose of radiation received in a short period, and its symptoms develop on a timescale of hours to months. In mild cases (doses of about 0.5-1.5 sieverts), it leads to symptoms such as nausea and a depressed white blood cell count (leukopenia). More severe exposures (2-4 sieverts) cause serious illness and some fatalities (usually due to destruction of bone marrow), while the highest doses (8 sieverts or greater) are almost always fatal. In cases of external exposure to radiation, severe skin damage and hair loss can result.
Broadly speaking, we can say that radiation sickness is caused by large-scale cell death, and that the severity of symptoms is a function of the body’s natural ability to cope with this cell death in a timely way. Doses below about 0.4 sieverts typically lead to no symptoms because the body is able to repair that quantity of damage without too much trouble, provided a person is otherwise healthy. However, higher doses increasingly overwhelm the body’s ability to repair all the damage — hence, the onset of more severe symptoms and death.
Radiation sickness is often referred to as a deterministic effect, as opposed to a stochastic effect (the case for cancer and genetic abnormalities). This is because the dose a person receives determines the severity of their symptoms, not just the probability of an effect. If you gave 100 people a one-time radiation dose of 2 sieverts (more on what a sievert is below), you’d pretty reliably get 100 cases of severe radiation sickness.
By contrast, cancer and genetic abnormality risk is referred to as stochastic, because an increased dose of radiation (say, an extra millisievert per year) increases the probability that a cancer will be contracted or a mutation passed on — there is no guarantee of any effect at all.
Because the vast majority of cancers are caused by effects other than manmade ionizing radiation, and because there is no specific signature of radiation-induced cancers as opposed to all other cancers, it is usually extremely difficult if not impossible to say that a given dose caused a given cancer. The increased cancer risk due to radiation is an extremely weak signal in the data (exceptions sometimes occur; for example, thyroid cancer is characteristic of exposure to Iodine-131, and in those cases the statistical signal is much stronger). The exact relationship between doses and cancer risk is highly controversial, and is discussed below.
There are a lot of units that are used to talk about radiation, or have been used historically, but since our concern is primarily with radiation’s effect on human tissue, the most important of these is the unit of equivalent dose, the sievert (Sv). One typically talks in terms of microsieverts (μSv), or millionths of a sievert, and millisieverts (mSv), or thousandths of a sievert (older sources use the unit rem, where 1 Sv = 100 rem).
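For concreteness, here is the unit arithmetic in code — a trivial sketch, with constant names of my own choosing:

```python
# Dose units mentioned above, all expressed in sieverts.
SV = 1.0
MSV = 1e-3   # millisievert: one thousandth of a sievert
USV = 1e-6   # microsievert: one millionth of a sievert
REM = 0.01   # older unit: 1 Sv = 100 rem

# A 2.4 mSv dose is 2400 uSv, or 0.24 rem:
dose = 2.4 * MSV
assert abs(dose - 2400 * USV) < 1e-15
assert abs(dose / REM - 0.24) < 1e-12
```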
It is extremely helpful to get a feel for the range of doses resulting from various activities, foods, diagnostic procedures, living locations and lifestyles — and for their relation to (a) radiation sickness, and (b) cancer risk. For that purpose, I recommend perusing xkcd’s wonderful infographic, which visually communicates things much better than I can do verbally.*
Note a few important reference points (mostly taken from the UNSCEAR, via Bodansky, pg. 74):
- The vast majority of the cumulative dose (hence, cancer-generating dose) a person receives over a given year is from natural sources, about 2.4 mSv/year. The largest single contributor is Radon-222, a gas in the air we breathe, produced by the decay of natural Uranium and Thorium.**
- The runner-up for cumulative dose is medical diagnostics, which averages out to about 400 μSv/year per person.
- The extra dose for the average citizen from the nuclear fuel cycle and nuclear accidents is — by comparison — minuscule: 2-4 μSv/year, or something like one part in 1000 of the total yearly dose.
- Radiation sickness does not generally occur below a threshold of around 500 mSv, but this is already a very large dose as compared with levels that might be received by the general public, even in the event of a fallout situation like at Chernobyl or Fukushima. Accordingly, radiation sickness is a concern for nuclear workers, emergency responders etc., but not typically for the general public, even in the vicinity of a nuclear accident.
- The lowest dose linked to detectably increased cancer risk is around 100-200 mSv — also a relatively high dose.
Because the effects of large, sudden doses are well-understood, there is little controversy surrounding radiation sickness. However, the link between radiation doses and cancer is more obscure and controversial.
What we know with some certainty is that doses above 100 mSv are clearly linked to increased cancer risk, and that the higher the dose climbs above 100 mSv, the greater the risk. Epidemiological studies (for example, of the survivors of Hiroshima and Nagasaki) have led to differing conclusions, even when performed based on the same data. However, typical risk coefficients obtained from such studies are in the range of 0.05 to 0.10 per Sv — meaning, very roughly, an extra 5-10% chance of fatal cancer from each sievert of radiation absorbed.
The difficulty is that these risk coefficients are based on data from the very high doses typical of a nuclear bomb or severe nuclear accident. This is understandable, because in such cases, cancer risk becomes a stronger statistical signal. However, there is no clear data on what the risk coefficient might be at much lower doses, despite many inconclusive low-dose studies. Sure, 100 mSv increases your risk of fatal cancer by about 1%, but does 1 mSv increase it by 0.01%, as would be the case if the effect were linear (risk = coefficient x dose), and had no threshold (any dose, no matter how small, increases your risk)?
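The linearity question can be stated very compactly. Here is a sketch using the 0.10-per-sievert coefficient from above — a working assumption at the upper end of the quoted range, not a settled value:

```python
# Linear no-threshold (LNT) model: risk = coefficient * dose, with no floor.
RISK_PER_SV = 0.10  # excess fatal cancer risk per sievert (assumed, upper end of range)

def lnt_excess_risk(dose_sv):
    """Excess probability of fatal cancer under LNT."""
    return RISK_PER_SV * dose_sv

# 100 mSv -> ~1% excess risk, as quoted in the text...
assert abs(lnt_excess_risk(0.100) - 0.01) < 1e-12
# ...and, *if* linearity holds all the way down, 1 mSv -> 0.01%:
assert abs(lnt_excess_risk(0.001) - 0.0001) < 1e-12
```

The controversy is entirely about whether that straight line is still the right shape at the bottom of its range.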
In practice, what has happened is that public health officials have adopted the linear no-threshold (LNT) model, largely because it is a conservative assumption on which to base policy. However, it is highly contested, with some suggesting a threshold relation (no bad effects below, say, 50 mSv), and others proposing beneficial effects at low doses (hormesis). I personally think the LNT model is pretty plausible (it fits with what I know about mutations — the Russian-roulette-with-lots-of-chambers model), but it is important to understand that it is just a working assumption. This is unfortunate, because our estimates of the number of people killed by cancers from, say, Chernobyl fallout have to carry truly massive error bars — somewhere between a few tens (thyroid cancer made these statistically detectable) and several thousand. However, because I think the LNT model is probably approximately right, and because, for us as well, it serves as a conservative assumption, let us provisionally adopt it.
The linear-no-threshold assumption — that even very low doses increase cancer risk — generates some ‘paradoxes’ of utilitarian public policy. These are centered on the extreme disconnect between a population effect (like 1000 excess cancer deaths) that looks significant, versus a personal risk (like an extra 0.001% chance of getting a fatal cancer) that looks negligible. The population effect is usually calculated as a collective dose, in units of person-Sv. For example, if 20 people receive 0.1 Sv each, that’s a collective dose of 2 person-Sv.
Suppose some disaster is about to befall a medium-sized city, which will cause a uniform, one-time dose of 100 person-Sv spread over all 1.2 million citizens, or 80 μSv each (of course, it’s never quite this simple, but work with me). Should public officials temporarily evacuate the city?
Well, according to the linear no-threshold model, (100 person-Sv) x (0.10 excess fatal cancer deaths/person-Sv) = ~10 excess cancer deaths. Those are real people who didn’t deserve to die, and assuming the evacuation itself costs no lives, those deaths could be prevented by evacuating the city. A naive public official might do so.
On the other hand, if I analyze it from my perspective as a single citizen, 80 μSv corresponds roughly to the extra dose from a couple of high-altitude airline flights, and comes out to about a 0.0008% excess risk of fatal cancer, assuming LNT. Considering those odds, I would definitely choose to stay (assuming I knew the estimated dose to be accurate).
Now, consider further that certain areas have higher background radiation — Colorado being the textbook example. The extra dose from living in Colorado is about 400 μSv/year, due mostly to natural Uranium deposits. With a population of about 5 million, that’s around 2000 person-Sv/year, or (by LNT) an expected ~200 excess cancer deaths per year! (In fact, the cancer rate in Colorado is lower than the US national average — I’m not sure why. Lifestyle?)
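The collective-dose arithmetic behind both the city example and the Colorado estimate is the same one-liner. Here it is spelled out, assuming LNT and the 0.10-per-sievert coefficient as above:

```python
RISK_PER_PERSON_SV = 0.10  # excess fatal cancers per person-sievert (LNT assumption)

def expected_excess_deaths(population, dose_sv_each):
    """Collective dose (person-Sv) times the risk coefficient."""
    collective_dose = population * dose_sv_each
    return collective_dose * RISK_PER_PERSON_SV

# Hypothetical city: 1.2 million people at ~80 uSv each (~100 person-Sv total):
assert round(expected_excess_deaths(1_200_000, 80e-6)) == 10

# Colorado: ~5 million people at ~400 uSv/year each (~2000 person-Sv/year):
assert round(expected_excess_deaths(5_000_000, 400e-6)) == 200

# Yet the *personal* risk in the city case is only 80 uSv * 0.10/Sv = 0.0008%:
assert abs(80e-6 * RISK_PER_PERSON_SV - 8e-6) < 1e-12
```

The ‘paradox’ lives entirely in that last line: the same calculation that predicts hundreds of statistical deaths assigns each individual a risk too small to notice.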
Obviously, nobody is proposing evacuating Colorado. Barely anybody has even heard that Colorado has a higher background level. And yet Three Mile Island, the most infamous of North American nuclear accidents, is estimated by the relevant authorities to have released a collective dose of around 20 person-Sv, total.*** We will talk more about TMI in a later post on nuclear accidents.
It is not my purpose to trivialize these issues. All of these numbers represent real (although mostly unidentifiable) people who (probably) died instead of living.****
However, when considering the health effects of radiation from the nuclear industry, you need to be damned sure you’re at least being internally consistent (perhaps by doing some rough math), and remember the ugly but important fact that everything kills people. Driving kills people. Molasses kills people. Owning kittens kills people. Nuclear power kills... not many people, to put it mildly.
So if you’re willing to commute 10000 km/year instead of taking the bus, blithely accepting the additional ~0.008% probability of death that that implies, plus the other drawbacks such as pollution, then all other things being equal it just doesn’t make sense to grandstand about the dangers of a garden-variety nuclear plant — unless driving is literally infinitely more fun than having power for your iPad and your grandma’s respirator.
It is especially insane if you do not simultaneously grandstand much more loudly about a lot of other things, like the coal plant that will, with near inevitability, replace the base load your nuclear plant would have generated. Even ignoring the other health effects of coal plants, they release at least as much radioactive material as nuclear plants, and usually more (fly ash contains radioisotopes like Uranium and Thorium).
...Boy, am I ever getting ahead of myself. A future post will discuss death rates per unit energy for the various power generation methods, as one useful figure of merit. But next time, we discuss how nuclear reactors work.
* However, the well-known ‘banana equivalent dose’ mentioned here is disputed, and probably a fair bit less than 0.1 μSv. This is a bit of an old chestnut and people promoting nuclear power really ought to stop quoting it.
** Some sources, especially anti-nuclear ones, cite a lower (wrong) figure of 1 mSv/year. This is based on ignoring the effects of Radon-222, apparently in order to make doses due to nuclear power seem larger in comparison.
*** However, this is disputed by anti-nuclear folks; see e.g., Caldicott, p. 65. Several claim that doses were high enough to cause radiation sickness in multiple victims. See Wing, who lists the sources.
**** One major problem that I have with popular works critical of the nuclear industry, particularly Caldicott’s book, is the absence of relevant qualifiers about probability. For example, on pg. 61, Caldicott says of Plutonium that it “is so toxic and carcinogenic that less than one-millionth of a gram if inhaled will cause lung cancer.” Of course, a statistically sophisticated reader will balk at the “will” in that sentence, such an unqualified intimation of certain death, even if they know nothing about Plutonium. But many of Caldicott’s readers likely swallow this whole. In fact, according to a paper from Lawrence Livermore National Laboratory, the actual excess fatal cancer risk resulting from inhaling 1 μg (quite a bit of Plutonium) is about 1.2%.
* David Bodansky, “Nuclear Energy: Principles, Practices and Prospects.” 2004, Springer.
* John R. Lamarsh & Anthony J. Baratta, “Introduction to Nuclear Engineering.” 2001, Prentice Hall.
* Helen Caldicott, “Nuclear Power is Not the Answer.” 2006, Westchester Book Group.
* A useful source on the relation of dose to cancer risk (note: 1 Sv = 100 rem).
* On coal vs nukes.