For Andrew Maidment, the chief of physics for the Department of Radiology at the University of Pennsylvania, it was one of the oddest phone calls he’d ever received. “A woman called me up once,” he remembers, “and I hear this child screaming in the background, yelling and crying, crying. She starts off with, can I hug my son again?”
Part of Maidment’s job is consulting with patients and occasionally answering strange questions, but this was a new one: “She explains that her son had broken his leg three weeks earlier, and he’d had an X-ray for his leg. And was he still radioactive? She’s so afraid of radiation she won’t hug her kid.” Maidment reassured the woman that any radiation her son had received vanished as soon as the X-ray machine was turned off, and that her child wasn’t radioactive. “She says, ‘so I could have hugged my son right away?’ And I say, ‘yeah.’”
It was an example of a phenomenon that scientists and other experts have dubbed radiophobia — the fear of ionising radiation. Every horror filmmaker worth her fake blood and monster makeup knows that nothing is more terrifying than the unseen: the creak on the stairs, the shadow on the curtains, the hint of fatal evil.
And radiation — a mysterious presence that can’t be felt by our senses — is just such a terror.
This fear has become the default setting for most people — and plenty of experts argue that’s for good reason. Ionising radiation strips electrons from atoms and molecules, converting them into ions, or charged particles. And in the molecules that make up living tissue, such as DNA, it can wreak havoc. If the intensity and rate of exposure are too great, the organism sickens and can die. This is called acute radiation syndrome, or ARS, and it killed many people in the aftermath of the atomic bombings of the Japanese cities of Hiroshima and Nagasaki in the 1940s. It also took a toll following the nuclear accident at Chernobyl, which was recently revisited, to critical acclaim, in a haunting HBO miniseries. Fortunately, most living organisms rarely, if ever, encounter radiation at such toxic levels.
Crystallising events like these have reinforced in the popular mind a technical model that today informs the regulation of nuclear radiation the world over. It’s called the “LNT” model, short for “linear no-threshold,” and it holds that at any level of radiation exposure greater than zero, there is some damage to human DNA. The higher the level, the greater the harm. The “linear” part of the LNT refers to the shape of a line graph plotting cancer risk against radiation dose: not a curve, but a straight line beginning at zero and proceeding upward to infinity. According to the LNT model, all radiation doses, however small, are dangerous, and the harm is cumulative over time. Thus, any exposure must be held to what nuclear regulators would call an “as low as reasonably achievable” dose, or ALARA.
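In code, the LNT model’s core claim is simple proportionality. Here is a toy sketch of that claim (the slope value is an arbitrary illustrative number, not a real risk coefficient from any regulatory body):

```python
def lnt_excess_risk(dose_msv, risk_per_msv=1e-5):
    """Linear no-threshold (LNT) model: modeled excess cancer risk is
    directly proportional to dose, all the way down to zero.
    The slope here is arbitrary, chosen only for illustration."""
    return dose_msv * risk_per_msv

# Under LNT the harm is also cumulative: ten separate 1-unit exposures
# carry the same modeled risk as one 10-unit exposure, and no dose,
# however small, is modeled as risk-free.
```

The ALARA principle follows directly from this picture: if every increment of dose buys a proportional increment of risk, the only defensible policy is to push exposure as close to zero as practicality allows.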
To most scientists, regulators, and the public at large, this only makes sense. And yet, an increasingly vocal group of nuclear experts suggests that the LNT model and its zero-tolerance approach to radiation may well be doing more harm than good. Naturally occurring radiation surrounds us all the time, they point out — emanating up from the Earth itself, and raining down on us from the cosmos. And outsized fear of radiation, they say — driven largely by the LNT framing — does much more than frighten people like the mother who queried Maidment. It drives voters and governments, for example, to abandon or avoid nuclear power — even though its fossil-based alternatives pollute the air and warm the planet, arguably killing far more people each year. (Indeed, according to the World Health Organization (WHO), 4.2 million people die every year from outdoor air pollution — orders of magnitude more deaths than those attributable to all civilian nuclear accidents, and the bombings of Nagasaki and Hiroshima, combined.)
Similarly, the partial meltdown of three reactors at Japan’s Fukushima Daiichi Nuclear Power Plant in 2011 led to mass evacuations that many experts now say resulted in more deaths than the earthquake and tsunami that caused the accident. “Overestimating radiation risks using the LNT model may be more detrimental than underestimating them,” argued researchers Jeffry Siegel and James Welsh in a 2015 article in the journal Technology in Cancer Research and Treatment, “as this approach has resulted in unnecessary loss of life due to traumatic forced evacuations, suicides, and unneeded abortions after the Fukushima nuclear accident.”
Are they right? Has the no-amount-is-safe approach to radiation — a paradigm that most of us consider so axiomatic that it informs both law and no shortage of fictionalised movie-house nightmares — actually made us less safe? Or, while perhaps technically tenuous, does it nonetheless provide a rational posture, given the potential risks? Jan Beyea, a nuclear physicist and the founder of Consulting in the Public Interest, a firm offering advice and expertise in the biological and physical sciences to nonprofits, universities, law firms, and other groups, suggests those are heated questions indeed. In a May 2012 article in Bulletin of the Atomic Scientists, Beyea wrote: “The debates can be brutal — so much so that, at times, they make the spats between William Jennings Bryan and Clarence Darrow look tame.”
The divisions run deep: “Underlying the [existing] risk models is a large body of epidemiological and radiobiological data,” the Environmental Protection Agency opined in support of LNT in 2011. “In general, results from both lines of research are consistent with a linear, no-threshold dose (LNT) response model.” That assessment was echoed by the Congressionally chartered National Council on Radiation Protection and Measurements (NCRP) in April 2018. Reviewing more than two dozen recent studies of the effects of low-dose radiation, the group concluded that the results added “substantial weight to the judgment on the use of the [existing] model for radiation protection.”
But Carol Marcus, a radiobiologist and nuclear medicine physician at UCLA who has recently petitioned the U.S. Nuclear Regulatory Commission (NRC) to relax some regulation of low-dose radiation exposures, disagrees. “The LNT,” she declared in her petition, “is based on hogwash.”
Ask most people how many were killed by radiation from the reactor accidents at Chernobyl, Fukushima, or Three Mile Island, and you’d probably get a wide range of answers. The 1986 Chernobyl disaster would no doubt be on top, with estimates ranging from a few thousand to hundreds of thousands. Fukushima, the first nuclear accident to occur during the era of 24-hour cable news and internet coverage, would probably draw estimates anywhere from a hundred up to thousands. And although most of us are aware that a serious disaster never occurred at Three Mile Island in 1979, some would probably guess that at least a few fatalities resulted, with leukemia or thyroid cancer cases manifesting themselves years later.
The actual figures are surprising. Most official sources give a figure of a few dozen deaths caused directly by radiation from the Chernobyl accident, though estimates of projected future deaths from cancer vary widely. That’s partly due to the unreliability and incompleteness of data from the former Soviet Union, as well as the impossibility of distinguishing cancers caused by Chernobyl from those from other causes. Projections range from around 4,000 (from scientific groups affiliated with international agencies such as the WHO and the United Nations) to 100,000 (from environmental organisations like Greenpeace). And while some research suggests that the rate of birth defects increased in areas contaminated by radiation following the meltdown, a lack of data on pregnant women’s actual exposure — along with other confounding factors like alcohol consumption and diet — make it impossible to draw definitive conclusions.
Fukushima is a considerably easier case. At the moment, only one radiation-related death has been reported — a Fukushima plant worker who died of lung cancer first diagnosed in 2016. As for Three Mile Island, there have been no official deaths from radiation.
This chasm between public perception and actual casualty data is a testament to decades of misconceptions, misinformation, and misapprehensions about radiation, reinforced by the relentlessly baleful portrayals in popular culture and the dark shadow of the mushroom cloud. From the earliest post-Hiroshima B-movies, comic books, and pulp novels, the underlying message has been the same: Radiation is bad. It can make humans and other creatures grow or shrink, give them special abilities, or mutate them into monsters.
Radiation, of course, does none of those things, though it certainly does have devastating effects at high doses. It’s the over-regulation of even low-dose exposures, however, that LNT critics say has done more harm than good — and has given rise to what they characterise as irrational fears. The major alternative to the LNT model posits the existence of a threshold, or a measurable limit, below which there’s no danger. This reflects the reality that in our daily lives, radiation can never be eliminated, only limited. “Radiation is ubiquitous,” says Chary Rangacharyulu, a physicist at the University of Saskatchewan. Every day, we are bathed in a natural low-level radiation background from various sources: cosmic rays, radon gas, radioactive elements in the Earth, even our own bodies.
Millions of years of evolution have provided earthly life with intricate genetic and cellular mechanisms that, in most cases, can either repair or neutralise damage by causing overly irradiated cells to self-destruct. “The basic premise of LNT that any amount of radiation, however small, is harmful, is flawed,” says Rangacharyulu.
Or, as Siegel, Charles W. Pennington, and Bill Sacks wrote in a 2017 article in The Journal of Nuclear Medicine: “The primary LNTH [linear no-threshold hypothesis] fallacy is it excludes this evolutionary biology, ignoring the body’s differing responses to high versus low radiation doses. Low doses stimulate protective responses; high doses overwhelm and inhibit such protections.” In another article, Siegel and Pennington contend that “LNTH-derived policy is as unsafe a practice as shouting fire in a crowded theatre.”
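The threshold alternative can be sketched the same way. In this toy version, both the 100-unit cutoff and the slope are arbitrary illustrative numbers, not figures drawn from any regulatory standard:

```python
def threshold_excess_risk(dose_msv, threshold_msv=100.0, risk_per_msv=1e-5):
    """Threshold model: below a cutoff, repair mechanisms are assumed to
    neutralise the damage, so modeled excess risk is zero; above it,
    risk rises linearly. Both parameter values are purely illustrative."""
    return max(0.0, dose_msv - threshold_msv) * risk_per_msv

# The two camps agree that high doses are dangerous; they diverge at
# low doses, where this model assigns zero excess risk below the
# cutoff while LNT assigns a small but nonzero risk to any exposure.
```

The policy consequences of the two curves differ sharply at the low end: under a threshold model, exposures below the cutoff need not be driven ever lower, while under LNT every reduction, however small, is counted as a safety gain.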
If the LNT model is as flawed as these critics argue, how did it become the foundation of our regulatory framework, not to mention our cultural attitudes, regarding radiation? The first strong evidence of a connection between radiation and genetic damage came from experiments in 1926 and 1927 by the American geneticist Hermann Joseph Muller. Muller exposed groups of fruit flies to high doses of X-rays at high rates, reporting what looked to be definitive evidence of mutagenesis, or the production of genetic mutations. He eventually won a Nobel Prize in 1946 for this work, claiming in his Nobel lecture that there was “no escape from the conclusion that there is no threshold dose.”
Further research in the decades after Muller’s early experiments, however, suggested that in fact there was a threshold at low doses or exposure rates. It remains unclear whether Muller merely dismissed these new findings as incorrect, or purposely ignored them. Edward J. Calabrese, a toxicologist at the University of Massachusetts Amherst, is the most vocal proponent of the latter view. He argues that close examination of the historical and scientific record proves that Muller was being intentionally “misleading and deceptive,” and tried to “blunt criticism of [his] Nobel Prize lecture and its impact on his reputation.”
But others have responded that Muller was an extraordinarily conscientious researcher and his unequivocal support of the LNT was based on his honest, if imperfect, appraisal of the scientific evidence of the time. In any event, Muller’s Nobel speech helped cement the scientific acceptance of the LNT model, and the 1956 recommendations of the National Academy of Sciences’ Committee on the Biological Effects of Atomic Radiation (BEAR I) made it official. Subsequent NAS committees combined the work of Muller and others on radiation mutagenesis with preliminary analyses of data for Hiroshima and Nagasaki to extend the application of the LNT to carcinogenesis.
This, the LNT’s opponents argue, was a major mistake. The possibility that huge doses of radiation cause genetic mutations in future generations, which was the core of Muller’s work, is one thing; the possibility that low doses cause cancer in living organisms over time is quite another. In other words, a high dose of radiation all at once isn’t the same thing as small amounts over a long time.
Siegel, Pennington, Sacks, and Welsh have pointed out that “the difference between the lowest experimental dose [used by Muller] and zero dose is huge,” arguing that it’s “unacceptable to simply assume, without further observation,” that there’s some lasting damage all the way down to the lowest dose. As far back as the 16th century, it was recognised that while a substance might be harmless or even beneficial in small amounts, there is a threshold above which it becomes harmful. According to the Renaissance physician Paracelsus, known as the father of toxicology, “Poison is in everything, and no thing is without poison. The dosage makes it either a poison or a remedy.”
Nonetheless, as the New York Times science reporter Gina Kolata noted nearly two decades ago, “the idea that radiation’s effects were directly proportional to its dose caught hold and soon was being used to predict cancer cases.”