This article was originally written for Radioactive Times in 2008. I didn't set out to write the whole history of radiation protection - just to highlight the turning point when the bogus concept of absorbed dose was foisted on the world.
Absorbed doses of ionising radiation are defined as an average of the energy transferred into large volumes of body tissue. This approach is valid for considering external exposures, like X-rays, natural gamma rays or cosmic rays, but not for situations where radioactive substances inside the body irradiate microscopic volumes of tissue selectively. Particles of Uranium and Plutonium are examples; the range of their alpha emissions is so tiny that all the energy is concentrated into a few hundred cells. Some call this kind of situation "pinpoint radiation". Using absorbed dose to assess the potential health damage is like a doctor examining a child whose skin is covered with small red marks and saying:
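To see how much the averaging matters, here is a back-of-envelope sketch (my own illustration with assumed numbers, not figures from the article): the same deposited energy yields utterly different doses depending on the mass it is averaged over.

```python
# Illustrative sketch: absorbed dose is energy divided by the mass it is
# averaged over, D = E / m (gray = joule per kilogram).
# Hypothetical numbers: one day's alpha decays from a small hot particle.

MEV_TO_J = 1.602e-13  # joules per MeV

decays = 1000 * 86400                 # assume 1000 decays/s for one day
energy_j = decays * 5.0 * MEV_TO_J    # ~5 MeV per alpha (typical)

whole_body = energy_j / 70.0          # averaged over a 70 kg body
local = energy_j / 5e-10              # averaged over ~500 cells (~0.5 microgram)

print(f"whole-body average: {whole_body:.2e} Gy")
print(f"local dose:         {local:.2e} Gy")
# Same energy; the local figure is ~1.4e11 times the whole-body figure.
```

The ratio depends only on the two masses, which is the point: averaging over the whole body makes a ferocious local insult look negligible.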
Now look, Mrs. Smith, I'm a doctor and I'm telling you that even if your lodger does stub out his cigarette on little Nelly's tummy there's no problem, because she absorbs very little energy from it. You give her a far bigger dose when you put her in a nice warm bath.
The trick was pulled in the depths of World War 2, subverting the science of radiation protection in order to protect the Manhattan Project and the A-bomb; it has served to protect the nuclear industry ever since.
Until the 1920s the main focus of radiation protection was external X-rays, but the Radium dial painters' scandal made it obvious that internal effects needed specific investigation. This led to new standards determined by looking at the actual effects of radium in the dissected tissues of people.
Radium is produced by the radioactive decay of natural Uranium. Its own radioactive decay emits alpha particles. Unlike X-rays and gamma rays, alphas have very little penetrating power so they are only hazardous once they're inside the body. Even then they don't travel far but the downside is that all their energy is deposited in a very small volume of cells.
From the earliest years of the 20th century luminous Radium paint was applied to the faces of clocks, watches and compasses to make them glow in the dark. World War 1 boosted demand and through the following decades hundreds of girls and women were employed to paint dials and pointers with various brands of paint - Undark, Luna and Marvelite. They would routinely put the tips of their paint brushes between their lips to obtain a fine point for the trickier numerals. By 1923 it was clear that the Radium they thus ingested was causing dreadful, agonising and frequently fatal illnesses.
Radium mostly lodges in bone, so the diseases affected the blood-forming function of the women's bone marrow, leading to anaemia. Those with higher body burdens had ulcers and their bones were weakened to the point where vertebrae collapsed and legs would break spontaneously. The first deaths directly attributed to Radium Necrosis came in 1925. The inventor of the Undark brand died like his workers, his bone marrow destroyed and his hands, mouth and jaw bones eaten away. Court cases, compensation payments and improved workplace practices followed (a ban on licking brushes was the first) but for a decade and a half there were no mandatory exposure limits.
By 1941 America was once more tooling up for industrialised warfare and the government was ordering large numbers of luminized instruments. By that time the global total of Radium extracted from the earth's crust was only 1.5 kilograms but, already, the deaths of more than a hundred people were attributable to its processing and use. Officials insisted that safety standards be devised, including a tolerance limit for internal Radium. A committee of the National Bureau of Standards looked to a post mortem study of Radium dial painters and people who had been exposed to Radium through medical treatments. They saw that there were detectable injuries in all the bodies which contained a total of 1.2 micrograms or more of Radium, but no injuries were discernible in those containing 0.5 micrograms or less. The committee settled on 0.1 micrograms as a cut-off. The history books show they knew this was a highly subjective stab in the dark.
Since Radium decays to Radon gas officials were able to use Radon as an indicator for metering. From then on, Radium workers were required to breathe into an ion chamber which detected the radioactive decays of Radon and its own daughter, Polonium. An immediate change of occupation was recommended as soon as the level indicated that a worker's body contained more than 0.1 micrograms of Radium.
World War 2 was midwife to the principle of nuclear fission, to a completely novel substance - Plutonium - and to the possibility of a Plutonium-powered bomb. The Manhattan Project was set up to make Plutonium for the bomb in secret and in near total ignorance of its effects on health. It was known to be an alpha emitter so, for expediency, the standards for Radium were extended to Plutonium, modified by animal experiments comparing the effects of the two substances.
All this - both the Radium standard and the Plutonium standard derived from it - was primitive science which had no way of detecting subtle lesions and cancers which may take decades to appear. The discovery of the double helix structure of DNA was still a decade away and for another 50 years no-one suspected the existence of epigenetic effects (genomic instability and the bystander effect). So the safety standards were unlikely to reflect long-term health effects but they did have the huge philosophical advantage of being rooted in reality; the Radium researchers had followed the essentially scientific principle of looking for a relationship between cause and effect. Maybe this was because the inquiry was driven by medical practitioners, campaigners for workers' rights and newspapers eager for the human interest angle on any story. Maybe their investigation enjoyed some liberty because the dial painting industry was owned privately, rather than by any government, and because at that time the fate of the "free" world did not seem to hang on the outcome.
By 1944 everything had changed. Plutonium was being produced in significant amounts and any potential it might have to kill its own workforce now affected a top-level policy funded by a bottomless budget with the imperative of building the bomb before Stalin could. More crucially for the scientific principles of radiological safety, it was no longer physicians who were in charge, but physicists.
The agent of change was a British physicist, Herbert Parker, head of radiation protection at the Manhattan Project. His earlier career in British hospitals had made him familiar with X-rays and a kind of therapy that used Radium as an external source, confining it in tubes and placing it carefully to irradiate cancerous tissues. (This medical application had been tried as early as 1904, only six years after Radium was discovered. In marked contrast to the dial painters' problems, it didn't involve Radium becoming inextricably mingled with a patient's bones.) Parker had a physics-based view; radiation was a single phenomenon, whether it came from an X-ray machine or a speck of Plutonium. As with light, where the physicist isn't too interested in whether the source is a light bulb or the sun, Parker was concerned with how much energy the radiation delivered to the tissue of interest. The language here is of ergs, from the Greek for work. The erg is defined in terms of the dyne, from the Greek for force; the units are physical - movement, velocity, grammes of mass, centimetres of length, seconds of time.
Parker was one of the first to call himself a Health Physicist. In his world there was no call for a bedside manner.
Using his physicist's approach, Parker shifted the focus from direct investigation of the effects of specific substances onto a new concept - radiation dose - which he could apply to radiation from any source and all sources, providing a way to assess workers' total exposure to all the novel nuclides the Manhattan Project was now creating. He defined a unit of dose in ergs per gramme of tissue and called it the Roentgen Equivalent Physical, or rep. Its very name betrays the mindset; Wilhelm Roentgen was the discoverer of X-rays (for a long time they were called Roentgen rays). The source of X-rays is always outside the body, so we can see the understanding of dose, and hence risk, was now to be based on an external paradigm.
The first limit for Plutonium in the body based on Parker's dose model was set at 0.01 reps per day, a quantity which exactly matched the energy deposition from the old tolerance limit of 0.1 microgramme of Radium. No change there then. What did change was that instead of the empirical scientific inquiry based on actual tissue damage and instead of the tentative subjectivity of the 1941 Standards Bureau Committee's decision on a Radium level, the new model gave an impression of mathematical precision, certainty and universal applicability. This was the new, square-jawed and confident nuclear era where bombs of unimaginable power would biff the Red Menace into oblivion and unlimited atomic energy would fuel everything in a world of peace and plenty.
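For the curious, the claimed equivalence can be roughly checked with a back-of-envelope calculation. This is my own sketch, not from the article: it ignores Radium's daughter products (which in practice contribute further alphas), takes 1 rep as roughly 93 ergs per gramme, uses Ra-226's specific activity of 1 curie per gram, and assumes about 4.8 MeV per alpha.

```python
# Rough check (assumed figures, see lead-in) of the equivalence between
# the old 0.1 microgram Radium tolerance limit and 0.01 rep/day.

BQ_PER_G = 3.7e10        # 1 curie per gram of Ra-226, by definition
MEV_TO_ERG = 1.602e-6
REP_ERG_PER_G = 93.0     # one common early definition of the rep

activity = 0.1e-6 * BQ_PER_G                    # 0.1 microgram -> 3700 decays/s
erg_per_day = activity * 86400 * 4.8 * MEV_TO_ERG

# Mass over which that energy must be averaged to equal 0.01 rep/day:
mass_g = erg_per_day / (0.01 * REP_ERG_PER_G)
print(f"{erg_per_day:.0f} erg/day, averaged over ~{mass_g:.0f} g")
```

Under these assumptions the numbers come out at roughly 2,500 ergs per day spread over a few kilograms of tissue - of the order of the human skeleton, where Radium lodges - which is consistent with the two limits depositing the same energy.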
Any risk model needs two types of data - for exposure and for effect. Unfortunately, there were no reliable data even for X-rays despite 50 years' experience. There was too much variability in the machines and the conditions in which they were used; doses were largely unknowable and many of the long-term effects had yet to emerge. But after 1945 the surviving people of Hiroshima and Nagasaki provided the authorities with a fresh opportunity. Funded and controlled by America, data on the survivors' health was gathered (as it still is) in what have become known as the Life Span Studies or LSS.
A full analysis of the flaws in the LSS is beyond me. As far as studying internal radioactivity is concerned the flaw is fatal; the control population providing the base-line of expected rates of disease, to be compared with disease in the exposed population, was recruited from the bombed cities themselves - they had either been outside the city when the bomb fell, or in some other way were shielded from the flash of the explosion. The "exposed" population consisted of people who had been in the open and so received a large dose of external gamma rays. But both groups ingested and inhaled just as much fallout as each other; the only difference between them was the external irradiation, so the LSS are totally silent on internal radiation. Nevertheless, the LSS is the basis of radiation protection standards all over the world to this day, for both external and internal exposures.
The LSS were not begun until 1950 (another flaw, obviously, because by then many of the most susceptible people had died) but already, in 1948, America's Atomic Energy Commission had pressed the National Council for Radiation Protection (NCRP) to develop safety standards for the growing nuclear industry. An especial concern was the quantity of novel elements which, being alpha emitters, would present internal hazards. Separate sub-committees addressed internal and external radiation. The external sub-committee completed its work quite quickly but the other was slowed down by the many complexities of internal contamination. The problem was that physicists didn't have much clue about where radioactive elements go once they are inside the body, how long they stay there or what biological damage they do. Impatient with the delays, NCRP's Executive closed down the internal committee in 1951 and stretched the report of the external committee to cover internal radiation. Karl Z. Morgan, chair of the internal radioactivity sub-committee, refused to agree that internal could be dealt with like external. For the rest of his life he was a critic of official radiological protection bodies –
I feel like a father who is ashamed of his children.
In 1950, American influence revived the International X-ray and Radium Protection Committee (IXRPC), which had been dormant during the war. In fact only two of its members were still alive and one of those was an American who was Chairman of the American NCRP. But needs must, and an international body would probably look more credible than a unilateral American one, so IXRPC was reborn as the International Commission on Radiological Protection (ICRP). In reality ICRP was just an overseas branch of the NCRP and in 1953 it adopted the NCRP report wholesale.
An epilogue is a short speech at the end of a play. In the case of this drama it's hard to be brief. I'll give two snapshots - one is global, the other is a family tragedy.
In 1986 the accident at Chernobyl spread fallout round the whole planet and millions of people inhaled and ingested it. Thousands of published reports from Russia, Belarus, the Ukraine, Greece, Germany, Britain, and even as far west as the Californian coast show a wide range of post-accident health effects not predicted by ICRP's model. In 2007 ICRP adopted new Recommendations in which there is a single reference to one study of Chernobyl. It's a paper on thyroid cancer. They cite it for the sole purpose of establishing that it's so hard to be sure what doses the patients had got from the fallout that the accident can tell us nothing useful. ICRP clings so hard to the dogma of dose that they are willing to rob the human race of the chance to learn about the results of the worst ever reactor accident (I wrote this before Fukushima).
This is one among millions of similar stories, but enough detailed information has leaked out to let us learn from it.
In May 2007 The Guardian (linked here or here) and The Times carried reports of a Cumbrian woman’s shock at finding out what had happened to her father 36 years earlier.
Angela Christie's father, Malcolm Pattinson, died of leukaemia in 1971. He was 36 years old and he worked at Sellafield. Or he had worked there; the Times reported that by the time he died he had been off work for 18 months because his wife feared for his health. As soon as he was dead his employers made frantic efforts to obtain organs and bones from his body. The local coroner, doctors and solicitors were involved but the family was neither consulted nor informed. In 1979, after a long battle during which the employers admitted liability, an out-of-court settlement brought Mr. Pattinson's widow and daughters compensation payments variously reported as £52,000 and £67,000.
All this happened when Malcolm’s daughter Angela was in her teens. She grew up and went to work at Sellafield like her father. She married and had three children of her own. Then she read in a newspaper that her father had been one of many men in the industry whose organs had been harvested for radiological research. She asked for the legal papers and received several boxes full.
They're quite shocking, which may indicate why Mr Pattinson's employers were so interested in snatching his body parts. His liver contained 673 times as much Plutonium as the average for a sample of Cumbrians who had not worked in the nuclear industry and his lungs had well over 7000 times as much. His liver had 53 times the amount of Plutonium found in the most contaminated of the nuclear workers in other reports and his lungs had 42 times as much. Mr. Pattinson's body burden was far greater than any other worker data I have seen. I conclude that he had either been involved in an accident or had been working in an unacceptably dirty environment. Either would be a scandal, but the far wider scandal is that the industry and the government would not see even those monstrous levels as a likely cause of his death.
From the data published in the Guardian I calculated the radiation dose Mr. Pattinson received from his body burden of Plutonium. Using the same methods as the ICRP I worked out the annual dose at 26 millisieverts. That's about ten times the usual (bogus) yardstick of natural background but it would have been nothing very remarkable in the early 1970s. Even today, when standards are more cautious, employers would still not be breaking the law by exposing a worker to such a dose so long as it wasn't for more than one year in five.
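The mechanics of an ICRP-style internal dose calculation look roughly like this. The numbers below are hypothetical, chosen only to show the arithmetic - the article does not reproduce the measured activities - and the radiation weighting factor of 20 for alpha particles is ICRP's own.

```python
# Sketch of ICRP-style equivalent dose rate from an alpha emitter
# retained in a single organ: H = A * E * w_R * t / m
# (hypothetical inputs; see lead-in).

MEV_TO_J = 1.602e-13
SECONDS_PER_YEAR = 3.156e7

def organ_dose_sv_per_year(activity_bq, alpha_mev, organ_mass_kg, w_r=20):
    """Equivalent dose rate (Sv/year) to one organ from an internal alpha emitter."""
    joules_per_decay = alpha_mev * MEV_TO_J
    return activity_bq * joules_per_decay * w_r * SECONDS_PER_YEAR / organ_mass_kg

# Hypothetical example: 50 Bq of Pu-239 (5.15 MeV alphas) in a 1.8 kg liver
dose = organ_dose_sv_per_year(50, 5.15, 1.8)
print(f"{dose * 1000:.1f} mSv/year")
```

Note what the formula does: it spreads the alpha energy evenly over the whole organ mass - exactly the averaging this article argues is inappropriate for pinpoint irradiation.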
ICRP's risk estimates would not predict that a 26mSv dose would cause Mr. Pattinson's leukaemia, in just the same way as they do not predict the cluster of childhood leukaemia at Seascale, next door to Sellafield — the doses are far too low. According to ICRP, if Mr. Pattinson was going to die of any cancer, the chance that it would be caused by the Plutonium in his body was only 1.3 in 1000.
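That 1.3-in-1000 attribution is where the striking odds figure comes from - it is simply the odds against, a one-line piece of arithmetic:

```python
# ICRP's 1.3-in-1000 attribution, expressed as odds against the
# Plutonium being the cause (given that a fatal cancer occurs).
p = 1.3 / 1000
odds_against = (1 - p) / p
print(round(odds_against))  # roughly 770 to 1
```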
To the person in the street the idea that fatal leukaemia in a young man is 770 times more likely to be caused by bad luck, bad genes, bad diet, smoking, a virus or an act of God than by the acts of an employer who contaminated him heavily with a bone-seeking, alpha-emitting radionuclide may seem insane. It is insane. It is insane in the way Dr. Strangelove was insane; the logic is impeccable but the theoretical premises are wrong. The good news is that growing numbers of scientists are recognising that ICRP is in error. These include Jack Valentin, the man who recently retired as ICRP's Scientific Secretary.
Low Level Radiation Campaign