Is Radiation Actually Good For Some of Us?
By age 10, most people have accumulated enough radiation exposure to be at some risk, yet the science is complicated enough that low-level exposure could even have benefits.
Meet Reference Man, a kind of hypothetical Ken Doll: a 20-something white male, fit and hearty, out in the park doing a hundred one-armed pushups every morning at 5:30. He’s the guy most radiation exposure standards are designed to protect. But as a stand-in, he’s passé.
Reference Man was born when most of the evidence about the health effects of radiation came from high-dose exposures such as nuclear bombs. But the landscape has changed. Exposure now tends to come at low and often chronic levels, from sources such as medical technologies, the fastest-growing source of radiation exposure. Emerging science is eroding central assumptions about radiobiology. Effects at repeated low doses are different and subtler than those from episodic high doses. And mysterious, intertwining, and sometimes contradictory phenomena hint at both serious health risks and surprising benefits: cells communicate extensively about exposures, taking radiation’s influence far beyond the genome; cancer may not be the only harmful consequence; low-level exposure may enable organisms to build up a tolerance that would protect them from high doses; and healthy cells can give radiation-damaged cells the equivalent of a death sentence to stop the threat of disease.
These enigmas, and the fact that individual responses to radiation exposure vary widely, mean Reference Man can’t represent two-thirds of the population: the very young, the very old, the overweight, the immune-compromised — not to mention Reference Woman. Exposure limits based on Reference Man set by federal agencies, along with guidelines from advisory organizations worldwide, have yet to catch up with the strange realities now being revealed.
Is Any Exposure Safe?
Well before Hiroshima or even Madame Curie, people were exposed to natural radiation from cosmic rays and rocks. In the 20th century, humans added to the load with fallout from weapons tests, nuclear power waste, and medical advances like dental X-rays, CT scans, and cancer treatments.
The plethora of radiation units measuring all these sources gets in the way of understanding what they mean. Activity, the amount of energetic particles emitted into the environment, is measured in curies and becquerels; the energy absorbed by living tissue is measured in grays; and the sievert weights that absorbed energy for the biological effect of the radiation involved. Named for the Swedish physicist who was a founder of the International Radiation Protection Association, the sievert scale is the closest thing to a measure of biological effectiveness available. For humans, outright radiation sickness (red skin, nausea, organ failure) starts at about 1.0 sievert. (In this article and accompanying graphics, doses have been converted to sieverts for comparison.)
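The conversions behind those comparisons are simple; here is a minimal sketch using the standard definitions of the units (the helper names are ours, for illustration only):

```python
# Conversions among the radiation units mentioned above.
# Standard definitions: 1 curie = 3.7e10 becquerels (activity);
# 1 sievert = 1,000 millisieverts (dose). Helper names are illustrative.

BQ_PER_CURIE = 3.7e10
MSV_PER_SIEVERT = 1000.0

def curies_to_becquerels(curies: float) -> float:
    """Convert activity from curies to becquerels."""
    return curies * BQ_PER_CURIE

def millisieverts_to_sieverts(msv: float) -> float:
    """Convert dose from millisieverts to sieverts."""
    return msv / MSV_PER_SIEVERT

print(millisieverts_to_sieverts(100))  # 0.1 Sv, a tenth of the ~1.0 Sv sickness threshold
```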
This article appears in our March-April 2012 issue under the title “Could Radiation Actually Be Good For Some of Us?” To see a schedule of when more articles from this issue will appear on Miller-McCune.com, please visit the March-April magazine page.
Radiation exposure is cumulative over a lifetime, so limits on how much additional radiation it is safe to receive in any time period are set low as a precaution. Everyone in the U.S. is exposed to about 6 millisieverts a year from the combination of medical procedures, natural sources, and remnant radiation from the weapons testing era; by the age of 10, most people will have accumulated enough exposure from all sources to be at some risk.
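The arithmetic behind that accumulation can be sketched in a few lines; a minimal illustration using the article’s round figure of 6 millisieverts per year (the function name is ours, for illustration):

```python
# Cumulative lifetime dose at the article's average U.S. rate of
# ~6 mSv/year (medical procedures + natural sources + weapons-era remnants).
ANNUAL_DOSE_MSV = 6.0

def cumulative_dose_msv(age_years: float) -> float:
    """Total dose accumulated by a given age, in millisieverts."""
    return ANNUAL_DOSE_MSV * age_years

print(cumulative_dose_msv(10))  # 60.0 mSv by age 10
```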
The EPA’s and Nuclear Regulatory Commission’s current annual exposure limit is 1 millisievert — in addition to the annual estimate of 6 millisieverts mentioned above, so ideally people won’t be exposed to more than 7. But those who actually work around nuclear materials will likely reach or exceed that exposure, and then the precautionary principle becomes surprisingly pliable: the International Commission on Radiological Protection (which is different from Sievert’s organization) recommends an annual occupational limit of 20 millisieverts, while the NRC sets that limit at 50 millisieverts.
To arrive at these numbers, regulators and international agencies like the ICRP follow what’s called the linear no-threshold hypothesis. This says that any exposure above zero always creates some risk of harm, and each added increment of dose adds a corresponding increment of risk. The paradox is that regulators have to settle on some exposure limits to protect workers and the public, and these limits imply that below those lines, there is a safe zone — even while they insist there is no threshold of exposure that’s truly risk free.
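The linear no-threshold hypothesis amounts to a one-line model: excess risk is proportional to dose, and only a dose of zero carries zero risk. A minimal sketch; the risk coefficient here is a placeholder for illustration, not a published ICRP figure:

```python
# Linear no-threshold (LNT) sketch: risk scales linearly with dose and
# is zero only at zero dose. The coefficient is a hypothetical
# illustration, not an official ICRP risk factor.
RISK_COEFF_PER_MSV = 5e-5  # illustrative excess risk per millisievert

def excess_risk(dose_msv: float) -> float:
    """Excess risk under LNT: proportional to dose, no safe threshold."""
    return RISK_COEFF_PER_MSV * dose_msv

# Under LNT, each added millisievert adds the same increment of risk,
# so even the 1 mSv public limit carries some nonzero (if tiny) risk.
print(excess_risk(1.0) > 0)  # True
```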
Recent research has exposed how untenable this position is but has not yet offered much clarity or comfort.
While we are getting better at seeing radiation’s effects on individual cells, it remains difficult to predict the effect of any particular dose on a population. The conventional wisdom regarding what happens to people at exposures below 100 millisieverts has been essentially conjecture; many experts think exposures in that zone pose no harm. But epidemiological studies show there is wide variation in people’s sensitivity to radiation: for example, at 100 millisieverts — a tenth of the dose causing outright radiation sickness — women have about a third higher risk for cancers of solid organs than men, whereas men are significantly more at risk for leukemia, according to the National Academy of Sciences committee that is charged with recommending protection measures. Further, discussions of effects on populations focus on average dose, which doesn’t reflect what actually happens to individuals. “If you gave 1 millisievert to two people standing next to each other and looked at the total energy depositions in their cells,” says Keith Baverstock, a former radiation protection official at the World Health Organization, one might get pummeled and the other escape altogether, yet the average dose to that population would remain the same.
Based on emerging science, Baverstock considers 100 millisieverts “a high dose.” It takes about that much to destabilize a cell’s genome, suggested Mary Helen Barcellos-Hoff of the New York University School of Medicine, and colleagues, in 2008. And this year, German researchers Hagen Scherb and Kristina Voigt said that the ICRP’s risk models lead to “considerably underestimated health risks” at exposures under 100 millisieverts.
Scientists do agree that in our most sensitive population — fetuses — doses as low as 6 to 10 millisieverts are harmful. In 1958, pioneering British epidemiologist Alice Stewart found a 40 percent added risk of leukemia in children whose mothers had abdominal X-rays in that range while pregnant. Scherb and Voigt estimated in 2007 that prenatal exposure at the rate of 1.5 millisieverts a year could cause birth defects and even stillbirths. Limits formerly considered safe seem less and less so.
The Sociology of Cells
Beneath researchers’ and regulators’ struggle with dose and response, a deeper layer of scientific bedrock is eroding: the notion that a cell’s nucleus — considered the master control of cellular function and fate — is the sole target of radiation, that DNA is the thing that gets broken, and that cancer is the only result. This is the so-called “central dogma” of radiation health effects.
But as researchers observe individual radiation paths in single cells, they are finding that this approach is too narrow. Radiation can trigger the formation of oxygen compounds known as free radicals in the cytoplasm surrounding the nucleus and induce gene mutations inside the nucleus, even in cells that haven’t been directly hit. Cancer is not the only possible result; the use of radiation as a cancer treatment has made it clear that such therapy increases the risk of cardiovascular disease. There are also hints that radiation may contribute to endocrine, metabolic, and nervous system disorders.
Cancer remains a major concern, of course. Ominously, even if a low dose of radiation doesn’t immediately cause cancer, it can cause nonmalignant cells to behave like cancer cells. In September 2003, Barcellos-Hoff reported that radiation doses as low as 100 millisieverts (which, you may recall, had for years been the top end of “safe” exposure) may push nonmalignant cells over the brink into dysplasia, a disorganized, pre-cancerous state that can then lead to cancerous growth and invasiveness.
Even genetic theory is being jostled. Cells reproduce by cloning themselves, and we know radiation can damage this process. Based on a molecular biological perspective, the central dogma treats the cell as passive; it’s just a gene’s way of making another gene, as British evolutionary biologist Richard Dawkins has said. In this view, a radiation-damaged gene remains dysfunctional in exactly the same way through generation after generation of cells, until — if the mutation is lethal — it kills off the organism carrying it (human or otherwise).
But at low-dose exposures, explains Carmel Mothersill, a radiation biologist at Canada’s McMaster University, a new phenomenon emerges: genomic instability. A cell that has been exposed to radiation can appear undamaged, but can transmit harmful information to its cloned offspring. The cell’s descendants can become malignant or otherwise malfunction, but the effect jumps all over the genome, from gene to gene, unpredictably. It’s not known why.
Could seemingly normal cells sending messages to future generations be a way of preserving their cell line and re-stabilizing their genomes down the road?
Baverstock takes the radical view that the mutations seen in genomic instability are “a random set of damages that the cell does to itself.” When a cell’s normal functions are disrupted, for example, by radiation, he says, the cell “experiments, and if it finds a solution, it grows, and if not, it dies.” This active role flies in the face of the “selfish gene” theory and the passivity currently attributed to cells by the central dogma.
Another issue to consider when cells reproduce is epigenetics, the ever-changing processes that manage the DNA double helix. Normally various molecules attach at points along the helix to control the switching on and off of genes, which is involved in every cellular process. Diet, chemical exposures, and other influences may alter epigenetic patterns and lead to disease. There is ample evidence that radiation affects epigenetic processes. For example, Alexandra Miller of the Armed Forces Radiobiology Research Institute in Bethesda, Maryland, has done extensive work on uranium toxicity, including on depleted uranium. In research published in 2005, Miller exposed mice to depleted uranium and tracked the incidence of leukemia by looking at their spleens, where blood cells develop. The spleen cells showed reduced DNA methylation, an important epigenetic regulator implicated in many cancers when it goes off track. And epigenetic effects may be just as important in reproduction as the genes themselves; a maladaptive epigenetic pattern may transmit disease, or at least susceptibility to it, to future generations.
Radiation’s influence on cell communications is even more intriguing. Cells communicating with an irradiated cell show internal changes similar to the exposed cell — even though they have not been hit themselves. This “bystander effect” is widely acknowledged, but its actual consequences are less clear. If bystander cells are copying the damage from an irradiated cell, this might trigger widespread and possibly malignant tissue damage. But, says Mothersill, the bystander effect may be a kind of alarm: a group of cells is alerted to radiation damage and acts collectively, sometimes instructing affected cells to commit suicide to clear up the problem. “If you have a nice functional bystander response, you’re probably better off,” observes Mothersill.
Mothersill’s research team demonstrated bystander communication not just between cells but also between organisms. In experiments conducted in 2006, Mothersill irradiated a group of rainbow trout in a laboratory tank, then added nonirradiated trout. In a second experiment, she irradiated a group of fish, then removed them from the tank and replaced them with healthy, nonexposed fish. In both experiments, the nonirradiated fish displayed bystander effects from radiation.
This “medium transfer” seems spooky. Or is it simply a chemical left in the water by the zapped trout? That is plausible, since cells are well known to be chatty, using many different chemicals to exchange information about and respond to the environment, often as a group. With the bystander effect, scientists don’t yet know how the information is transmitted, but it is likely to be consistent with other evidence of complex cell signaling.
Another potentially positive phenomenon is adaptive response, which says essentially that “forewarned is forearmed.” Proponents of adaptive response, or “hormesis,” say that mild radiation exposure can tune a cell’s defenses, protecting it from later, higher exposures. For example, in a 2008 analysis of several human and animal studies, Bobby Scott of the Lovelace Respiratory Research Institute in Albuquerque concluded that low-dose gamma radiation prevented lung cancer in lab rats that had inhaled plutonium, a potent carcinogen.
“Hormesis is the only thing I know that’s provided a quantitative estimate of plasticity — how much the system can adapt,” says Edward Calabrese, a toxicologist at the University of Massachusetts at Amherst, and a pioneer of adaptive response research.
There’s no question that cells can repair some post-exposure radiation damage, as long as they aren’t hit too hard too fast. Yet hormesis, which acts preventively, has triggered heated dispute among researchers and activists because it undermines the rationale supporting radiological protection measures. On one hand, Canadian nuclear industry consultant Jerry Cuttler has accused epidemiologists of manipulating low-dose data “to obscure evidence of radiation hormesis” in order to frighten the public away from nuclear power. On the other hand, Daniel Hirsch, president of the anti-nuclear group Committee to Bridge the Gap, calls hormesis “frankly, John Birch Society stuff.”
Mothersill takes a more moderate view: “To say low doses are good for you is an oversimplification.” Clearly a good outcome at one level can be a bad one at another: a cancer cell made more robust by hormesis could be fatal for an organism. Hormesis also tends to wear off after a few cell generations.
Shifting the Paradigm
For half a century, the theoretical strictures of genetics and molecular biology have kept a tight grip on research throughout the life sciences, including radiobiology. But what used to be muttering in the ranks about the central dogma is now audible dissent. In 2009, Baverstock, Mothersill, and 15 other researchers signed a declaration saying that based on the emerging evidence, low-level radiation exposure can lead to cardiovascular, immune, central nervous system, and reproductive diseases. The signatories also asserted that the ICRP’s risk model, which many countries adopt as their own, may be underestimating actual risk by a factor of 10 or more.
In search of a better approach, many radiation biologists are adopting systems theory, which posits that systems are more than the sum of their parts, and that unique properties may emerge at each level of organization, from proteins to populations. Under this holistic banner, Baverstock organized a conference in October 2011 to address the need for a broad revision of the theoretical approach to low-level radiation health effects.
If we were all Reference Man, perhaps our robustness would allow us to disregard low-dose risks. But we’re not; we’re flawed, vulnerable individuals. The human environment and myriad products we use every day expose us to hundreds of chemicals that disrupt biological processes and whose cumulative consequences are poorly understood. Radiation is another wild card added to the mix, potentially affecting everything from allergies to procreation, and little is known about how it interacts with these other influences.
Clarity may be a long time coming. The National Academy of Sciences committee that reviewed low-level effects in 2006 acknowledged the array of evidence but chose to stand upon the linear no-threshold rationale because the science remains unsettled. The Environmental Protection Agency has followed suit.
Nonetheless, emerging science illustrates the strength and elegance of life’s ability to adapt to daunting challenges — its imagination. The hints and glimmers of hormesis, genomic instability, the bystander effect, and epigenetics suggest there may be a new paradigm coming, though at the moment we see it as through a glass darkly.