A new look at prolonged radiation exposure

MIT study suggests that at low dose-rate, radiation poses little risk to DNA.

A new study from MIT scientists suggests that the guidelines governments use to determine when to evacuate people following a nuclear accident may be too conservative.

The study, led by Bevin Engelward and Jacquelyn Yanch and published in the journal Environmental Health Perspectives, found that when mice were exposed to radiation doses about 400 times greater than background levels for five weeks, no DNA damage could be detected.

Current U.S. regulations require that residents of any area that reaches radiation levels eight times higher than background should be evacuated. However, the financial and emotional cost of such relocation may not be worthwhile, the researchers say.

“There are no data that say that’s a dangerous level,” says Yanch, a senior lecturer in MIT’s Department of Nuclear Science and Engineering. “This paper shows that you could go 400 times higher than average background levels and you’re still not detecting genetic damage. It could potentially have a big impact on tens if not hundreds of thousands of people in the vicinity of a nuclear power plant accident or a nuclear bomb detonation, if we figure out just when we should evacuate and when it’s OK to stay where we are.”

Until now, very few studies have measured the effects of low doses of radiation delivered over a long period of time. This study is the first to measure the genetic damage seen at a level as low as 400 times background (0.0002 centigray per minute, or 105 cGy in a year).

“Almost all radiation studies are done with one quick hit of radiation. That would cause a totally different biological outcome compared to long-term conditions,” says Engelward, an associate professor of biological engineering at MIT.

How much is too much?

Background radiation comes from cosmic radiation and natural radioactive isotopes in the environment. These sources add up to about 0.3 cGy per year per person, on average.

“Exposure to low-dose-rate radiation is natural, and some people may even say essential for life. The question is, how high does the rate need to get before we need to worry about ill effects on our health?” Yanch says.

Previous studies have shown that a radiation level of 10.5 cGy, the total dose used in this study, does produce DNA damage if given all at once. However, for this study, the researchers spread the dose out over five weeks, using radioactive iodine as a source. The radiation emitted by the radioactive iodine is similar to that emitted by the damaged Fukushima reactor in Japan.
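
As a quick sanity check on the figures above (a back-of-the-envelope sketch using the article's rounded numbers, not data from the study itself), the reported dose rate of 0.0002 cGy per minute can be converted into the annual and five-week totals quoted:

```python
# Back-of-the-envelope check of the dose figures quoted in the article.
# All inputs are the article's rounded numbers, so results are approximate.
dose_rate = 0.0002                       # study dose rate, cGy per minute
background_per_year = 0.3                # average background dose, cGy per year

minutes_per_year = 365 * 24 * 60         # 525,600 minutes
minutes_in_five_weeks = 5 * 7 * 24 * 60  # 50,400 minutes

annual_dose = dose_rate * minutes_per_year          # ~105 cGy/year, as quoted
five_week_dose = dose_rate * minutes_in_five_weeks  # ~10 cGy (article says 10.5)
ratio_to_background = annual_dose / background_per_year

print(f"Annual dose:      {annual_dose:.1f} cGy")
print(f"Five-week dose:   {five_week_dose:.1f} cGy")
print(f"Times background: {ratio_to_background:.0f}x")  # ~350x, i.e. "about 400 times"
```

The small mismatches (10.1 vs. 10.5 cGy, ~350x vs. "about 400 times") come from the rounding of the per-minute rate; the article's figures are internally consistent to within that rounding.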

At the end of five weeks, the researchers tested for several types of DNA damage, using the most sensitive techniques available. Those types of damage fall into two major classes: base lesions, in which the structure of the DNA base (nucleotide) is altered, and breaks in the DNA strand. They found no significant increases in either type.

DNA damage occurs spontaneously even at background radiation levels, conservatively at a rate of about 10,000 changes per cell per day. Most of that damage is fixed by DNA repair systems within each cell. The researchers estimate that the amount of radiation used in this study produces an additional dozen lesions per cell per day, all of which appear to have been repaired.
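
To put that "additional dozen lesions" in perspective, a one-line calculation with the article's round numbers (an illustration, not study data) shows how small the radiation-induced increment is relative to the spontaneous background:

```python
# Radiation-induced DNA lesions as a fraction of the spontaneous rate,
# using the round numbers quoted in the article.
spontaneous_per_cell_per_day = 10_000  # conservative spontaneous estimate
radiation_per_cell_per_day = 12        # "an additional dozen" at the study dose rate

fraction = radiation_per_cell_per_day / spontaneous_per_cell_per_day
print(f"Radiation adds {fraction:.2%} on top of the spontaneous lesion rate")  # 0.12%
```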

Though the study ended after five weeks, Engelward believes the results would be the same for longer exposures. “My take on this is that this amount of radiation is not creating very many lesions to begin with, and you already have good DNA repair systems. My guess is that you could probably leave the mice there indefinitely and the damage wouldn’t be significant,” she says.

Doug Boreham, a professor of medical physics and applied radiation sciences at McMaster University, says the study adds to growing evidence that low doses of radiation are not as harmful as people often fear.

“Now, it’s believed that all radiation is bad for you, and any time you get a little bit of radiation, it adds up and your risk of cancer goes up,” says Boreham, who was not involved in this study. “There’s now evidence building that that is not the case.”

Conservative estimates

Most of the radiation studies on which evacuation guidelines have been based were originally done to establish safe levels for radiation in the workplace, Yanch says — meaning they are very conservative. In workplace cases, this makes sense because the employer can pay for shielding for all of their employees at once, which lowers the cost, she says.

However, “when you’ve got a contaminated environment, then the source is no longer controlled, and every citizen has to pay for their own dose avoidance,” Yanch says. “They have to leave their home or their community, maybe even forever. They often lose their jobs, like you saw in Fukushima. And there you really want to call into question how conservative in your analysis of the radiation effect you want to be. Instead of being conservative, it makes more sense to look at a best estimate of how hazardous radiation really is.”

Those conservative estimates are based on acute radiation exposures, extrapolated to predict what might happen at lower doses and lower dose-rates, Engelward says. “Basically you’re using a data set collected based on an acute high dose exposure to make predictions about what’s happening at very low doses over a long period of time, and you don’t really have any direct data. It’s guesswork,” she says. “People argue constantly about how to predict what is happening at lower doses and lower dose-rates.”

However, the researchers say that more studies are needed before evacuation guidelines can be revised.

“Clearly these studies had to be done in animals rather than people, but many studies show that mice and humans share similar responses to radiation. This work therefore provides a framework for additional research and careful evaluation of our current guidelines,” Engelward says.

“It is interesting that, despite the evacuation of roughly 100,000 residents, the Japanese government was criticized for not imposing evacuations for even more people. From our studies, we would predict that the population that was left behind would not show excess DNA damage — this is something we can test using technologies recently developed in our laboratory,” she adds.

The first author on these studies is former MIT postdoc Werner Olipitz, and the work was done in collaboration with Department of Biological Engineering faculty Leona Samson and Peter Dedon. These studies were supported by the DOE and by MIT’s Center for Environmental Health Sciences.


Comments

The arguing I hear is mostly in the other direction: that large-dose studies, whether in the lab or facilitated by meltdowns or actual atomic explosions, may understate what some epidemiologists think they see as the cumulative effect of multitudes of small doses.
I've never once heard that put forward by a rational person or someone in the field. Although we're data-starved, all the evidence (such as populations living with much higher background levels, and studies like this one) suggests that we overestimate low-dose threats. I'm willing to guess that whoever was selling you that claim had an anti-nuclear agenda.
People in Fukushima are scared of radioactivity. This MIT paper, however, gives them some courage.
Unfortunately, we get cancer even without high radiation levels, so it just means we can't measure cancer risk this way, I think. Can radiation risk be separated from other cancer risks? The linear no-threshold model seems more reasonable.
Studies of the mortality of atomic bomb survivors, Report 14, 1950-2003: an overview of cancer and noncancer diseases. Ozasa K. et al. http://www.ncbi.nlm.nih.gov/pubmed/22171960 This large-scale epidemiological study of the cancer risk of low-dose exposure in humans showed no threshold. Small-scale experiments in mice are not enough to settle the impact on humans.
According to a recent official investigation by Fukushima prefecture, about 33% (12,646 of 38,114) of children have already shown thyroid anomalies (cysts and nodules). I wonder how that result is compatible with the above report.
Can you provide a link to those results?
For decades the US nuclear industry has estimated radiation risk and mitigation measures based on the "linear no-threshold" hypothesis: that every millisievert of radiation dose increases the chance of cancer proportionally. This method of calculating risk and mitigation is known to be overly cautious because (to take just one of many examples) US citizens living in areas with much higher background radiation have not developed the higher cancer rates predicted by the model. Other nations have followed the US example in risk calculation. Studies like this one are extremely important for learning more about the real risks of low-level radiation, which are lower than generally believed. One effect of overestimating radiation risk has been to halt the US nuclear industry and slow the European one since 1980, with major consequences including accelerating climate change: a staggeringly high cost for being too cautious about radiation. Let's have more data and less fear of the unknown.
This data is not useful without a control population. For example, another 38,000 Japanese children from other parts of Japan, with equivalent diet and economic conditions but no risk of exposure to Fukushima, would have to be given the same investigation and tests. Very likely the control population would show no statistically significant difference from the Fukushima population. But such control studies are expensive, and therefore are rarely done. Only when people have significant fear for their health do they spend money to examine their health in such detail, and then they often find harmless but puzzling anomalies. As an example, US doctors recently published papers asking whether testing women for breast cancer at age 40 does more harm than good: many benign anomalies are detected, and many unnecessary surgeries are performed, while few malignant cancers are actually prevented. Instead, the doctors say, the data suggest waiting until age 50. Seek and ye shall find.
This study shows no genetic damage to mice exposed to 120 microsieverts per hour for five weeks. It bears little relevance to the effects on humans exposed to lower dose rates, like 1 microsievert per hour in Fukushima, for much longer periods, let alone multiple generations. Nor does it take into account other potential health impacts such as weakened immune system. I think it's irresponsible for the announcement to conclude from this study that a higher dose rate for humans should be permitted before evacuation.
I agree with all you said except for the last sentence. I would agree with that too, if the authors had done what you said. But, they didn't. The introductory paragraph uses the words "suggest" and "may." The announcement goes on to discuss why this study's result calls current policies into question. It never says the policies should be changed now, only that the study results suggest current policies may not be the best policies and more work should be done in this area.
This is potentially some of the most important biological research being done today. It is curious that there has been such a historical dearth of studies on the low-level biological effects of radiation. The DOE, NIH, CDC, and NAS should all be sponsoring additional research into this potentially revolutionary field at multiple research universities and national labs. The present BEIR LNT theory is non-empirical and thus by definition non-scientific. T.D. Luckey and Edward Calabrese have long documented observations of a bio-positive (radio-hormetic or homeostatic) dose-response to low levels of chronic exposure to ionizing radiation, peaking at ~100 mSv/100 mGy (10 cGy) per year. Multiple cohorts of mice should be exposed to a range of low-level radiation and studied throughout their lives (3-5 yrs), e.g. 1 Gy, 0.5 Gy, 0.2 Gy, 0.1 Gy, 0.05 Gy, and a control. Then we should proceed to primate testing. If LNT-BEIR can be disproved, it would revolutionize medicine and eliminate needless, burdensome regulations.
The authors ignore devastating hereditary effects. After Chernobyl (1986) some districts in Germany got fallout and other nearby districts did not (local rain). All districts had improved birth registration since ~1980. Enhanced radiation levels of only 0.5 mSv/a delivered a sudden ~30% increase in Down syndrome, spina bifida, stillbirth, etc. in newborns after Chernobyl. That occurred only in the districts with fallout! http://www.helmholtz-muenchen.de/ibb/homepage/hagen.scherb/CongenMalfStillb_0.pdf
Research after Chernobyl clearly shows elevated levels of stillbirth, congenital malformations, neural tube defects, etc. after enhanced radiation levels of only 0.5 mSv/a. E.g. http://tchie.uni.opole.pl/ecoproc10a/ScherbVoigt_PECO10_1.pdf
This is important work and, as many have commented, much more is needed. My statistical analysis of Hiroshima leukemia vs. dose data indicates: (a) %Probability = 0.2 + 0.4*(Dose^2 in Sv^2); (b) a linear model is excluded at 99.5% confidence. On the issue of Chernobyl-induced birth defects, I intend to dig into the references for statistical significance. In many alarmist publications, the rise in mortality at that time is more attributable to the economic disintegration and rise in alcoholism precipitated by the political collapse in eastern Europe. The large number of thyroid cancers (~4,000 extra cases, with expected 5% mortality if treated) in Belarus was certainly attributable to Chernobyl and the absence of remedial actions such as distributing iodine pills and banning milk consumption. Other problems: all safety mechanisms were disabled; there was no concrete containment; and there was an initial coverup (news blackout).