COMARE and South West Cancer Intelligence Service

Cancer in Burnham-on-Sea, Somerset
Our response to COMARE’s Statement on the Parents Concerned About Hinkley (PCAH) Questionnaire study

[Click on this link to see our original page on the PCAH survey]


This is a continuing debate, which COMARE (Committee on Medical Aspects of Radiation in the Environment) and other Quangos are conducting at the greatest possible distance from the public and organisations representing the public interest.

News

In response to corrections submitted by us and by PCAH in 2004, COMARE has altered its November 2003 Statement. The changes, however, render it internally inconsistent and address only one of COMARE's errors. We shall write asking them to face up to all our points.
The page below is our original response, with added observations on the present COMARE position (in red). Where no red ink appears, our criticisms stand.
Click here to see a new page on the nonsensical trap COMARE has dug for itself and fallen into.


There are major problems with COMARE's Statement (COMARE 2003):
  • it attacks PCAH's questionnaire survey for not being a definitive study, whereas it was intended to be indicative; it could never have been anything else.
  • it makes a fatal error about the way the PCAH survey was conducted, assuming that 30% of households had responded to a 100% canvass, whereas in fact there was a 100% response to a 30% sample. COMARE allowed this error to remain for nearly a year although they were told about it repeatedly. In the current version of their Statement they have removed it.
  • it overlooks flawed analysis in the subsequent extended study conducted by the South West Cancer Intelligence Service (SWCIS 2003).
  • it overlooks the fact that once the flaws are corrected the SWCIS study confirms PCAH's findings and the existence of elevated risks downwind of the contaminated mud banks.
  • it uses a selective subset of the SWCIS data.
  • it uses inappropriate significance testing.
  • it relies on a scientifically invalid assumption about the aetiology of chronic lymphocytic leukaemia.
1. Background
The Parents Concerned About Hinkley survey (Green Audit 2002) must be seen in its historical context if its nature and scope, and the shortcomings of COMARE's Statement, are to be understood. In 1988 leukaemia studies for Somerset Health Authority (Bowie and Ewings 1988) suggested that the operation of Hinkley Point may have caused increases in leukaemia in its vicinity. A study of gamma ray backgrounds (NRPB 1988) showed that gamma dose rates over Steart Flats were 3 times higher than average background inland; Burnham beach and Combwich beach were between 2 and 3 times higher. In 2000 mortality studies by us (Green Audit 2000) compared cancer mortality in all age groups from 1995 to 1998 for 103 wards within a 30km radius of Hinkley Point nuclear power station.
  • All malignancies combined, and breast, prostate and lung cancer individually, were consistently high among the down-winders at Burnham on Sea. Breast cancer mortality was 47% above expected; prostate cancer was 48% above.
  • In concentric arcs around the centre of the Steart Flats, risk was highest in the population living within a 5km radius. Thereafter, risk fell off with a trend similar to that found elsewhere for inland penetration of salt and radioactive particles in sea spray.
  • Risk was significantly higher in wards close to the rivers and on the low lying areas and was generally low in those wards above the 200m contour. This may reflect the general effect of rainfall causing concentration of radioactive material in the rivers.

In 2000, in response, Somerset Health Authority (SHA) accused us of poor statistics and of using the wrong populations, i.e. the 1991 census data. Subsequently we checked our results using population figures for 1995-1999 supplied by SHA. There was still a 100% excess of breast cancer mortality, and adding in the year 2000 data increased the effect. At this point SHA changed their ground: they conceded that there was indeed a significant excess mortality in Burnham North, but now said there was no evidence that it could be caused by Hinkley emissions. They made no apology for having rubbished our earlier figures, and claimed that even they did not have access to incidence data because of patient confidentiality.

In March 2002, under the Data Protection Act, we discovered internal health authority papers which revealed that SHA did have cancer incidence statistics. In an email, the health authority also described a quick and dirty study they had put together to deny the excess in Burnham. But they had made a basic error leading to lower apparent cancer risks - they had applied population figures for the year 2001 to data covering 1988 - 97. The 1991 census was more appropriate; using the 1991 populations showed that the statistically significant excess was real.

Area            Observed 1988-97   Expected 1988-97 (1991 popns.)   Excess   Statistically significant?
Burnham North          61                  46.9                      30%      yes
Burnham South          50                  46.1                       8%      no
Burnham N & S         111                  93                        19%      yes
Table 1: Cancer incidence 1988-97 relative to appropriate 1991 populations

In August 2002 local people began to ask the health authorities for access to incidence data. The SWCIS kept aloof, but the Somerset Coast Primary Care Trust (SCPCT) did meet us and PCAH, Stop Hinkley, and other members of the public and the press. For a while it looked as if there might be some co-operation, but the Community Based Organisations (CBOs) realised that when campaigners address these issues with officialdom there is a large trust deficit. For this reason they asked for a joint study of cancer registration data, with a study design to be agreed between epidemiologists acting for the health authorities and us acting on behalf of the CBOs. The PCT refused to work with us, arguing that to do so would compromise their independence. No data were released, and PCAH decided to look for its own. A doorstep questionnaire survey, unique in the UK, was conducted in the north ward in Summer 2002. It had an almost 100% response, covering 1,500 people - about 30% of the population.

Tables 4 and 5 below are taken from our analysis of the data collected by PCAH's volunteers. They give the Relative Risks calculated by us for the cancers where there was some evidence of statistically significant excess risk, and also for some common cancers, for the two periods 1996-2001 and 1998-2001. For other cancer sites reported, the risk could not be calculated, either because the numbers were too small or because the data were too poor (the questions on the form having been vaguely or incompletely answered). A sketch showing how the Relative Risks and Poisson p-values can be reproduced follows Table 5.

Cancer           Observed   Expected   RR     Poisson p-value
All malignancy      45        44       1.02   NS
Female breast       10         5.39    1.86   0.05
Kidney               4         0.84    4.76   0.01
Leukaemia            4         0.976   4.1    0.02
Cervix uteri         2         0.36    5.6    0.01
Colon                5         5.2     0.96   NS
Prostate             4         4.8     0.83   NS
Lung                 4         -       -      Low
Table 4: Relative Risk (see text) for cancer in Burnham North 1998-2001 indicated by the PCAH survey (based on England and Wales rates for 1997).

Cancer           Observed   Expected   RR     Poisson p-value
All malignancy      64        66       0.97   NS
Female breast       16         8.09    1.98   0.01
Kidney               5         1.26    3.96   0.01
Leukaemia            4         1.46    2.73   0.05
Cervix uteri         3         0.54    5.55   0.01
Colon                6         7.8     0.77   NS
Prostate             7         7.2     0.97   NS
Lung                 4         -       -      Low
Table 5: Relative Risk (see text) for cancer in Burnham North 1996-2001 indicated by the PCAH survey (based on England and Wales rates for 1997).
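
As a check on the arithmetic, the Relative Risks and one-sided Poisson p-values in the tables can be recalculated directly from the Observed and Expected counts. The following is a minimal sketch in Python (using scipy); small differences from the published p-values may arise from rounding and from the precise variant of the test used.

from scipy.stats import poisson

# Recalculate RR and one-sided Poisson p-values from the Observed and
# Expected counts given in Table 4 (Burnham North, 1998-2001).
table4 = {
    # site: (observed, expected)
    "Female breast": (10, 5.39),
    "Kidney":        (4, 0.84),
    "Leukaemia":     (4, 0.976),
    "Cervix uteri":  (2, 0.36),
}

for site, (obs, exp) in table4.items():
    rr = obs / exp                # Relative Risk = Observed / Expected
    p = poisson.sf(obs - 1, exp)  # P(X >= obs) under a Poisson null with mean exp
    print(f"{site:>13}: RR = {rr:.2f}, p = {p:.3f}")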

2. The above history makes it clear that the PCAH questionnaire survey was the citizens' response to lack of trust and lack of information. It could not be an exhaustive study; it was intended as an indicator of whether the excess cancer mortality was reflected in excess incidence and, if so, to push the authorities into action to protect public health. It succeeded - the Somerset Coast Primary Care Trust asked SWCIS to compile a full and detailed report on cancer incidence rates in the Somerset area. In October 2002 Tariq Malik, Deputy Director of SWCIS, said (in a letter to LLRC):
This report will contain all the data [Green Audit] have requested along with detailed analysis of coastal wards and regional variances. The report is currently undergoing independent statistical review prior to release and upon completion I am happy for a copy to be made available to you, for your own analysis.
Since the data have never been released this promise was empty. However we now have the report and have extracted some interesting information which confirms the risks identified in the mortality studies and the PCAH study. (See SWCIS results in para. 14 below.)
3. COMARE’s most serious error - a 30% survey, not a 30% response
The most important flaw in the COMARE Statement is the assumption that the questionnaire was delivered to all the households in the Burnham North ward. In fact the questionnaire was only administered to people who opened their front doors when the volunteers called. So, far from the 30% response which COMARE alleges, PCAH had virtually a 100% response from a 30% sample. This makes almost complete nonsense of COMARE's suggestion that the PCAH study is undermined by self-reporting bias (the suggestion that people are more likely to respond to a survey of disease if they have had the disease than if they have not, leading to over-reporting). COMARE has been told about this basic mistake in its position both in writing (PCAH 2004) and face to face (Busby to Professor Bridges, 16th December 2003). COMARE should correct their mistake or explain why they will not. This error has been removed in the amended version currently on the COMARE web site. However, their review has been sloppy and incomplete - see paragraph 6 below for an explanation.
4. The PCAH questionnaire survey - what it was and what it wasn’t
The distinction between PCAH's intentionally indicative study and a more definitive study has to be borne in mind, but COMARE's Statement demonstrates a complete failure to understand the realities of public concern and grass-roots action. It effectively damns chalk for not being cheese. CBOs generally have small resources, which set limits on what can be achieved. More seriously, one must recognise that PCAH's volunteers were doorstepping people with a very challenging issue; it takes considerable courage to do such a task. COMARE says (para. 4) that they do not wish to criticise either the idea of carrying out such a survey or the involvement of the local community, but this does not sit well with its criticism (COMARE 5(a)) that
... no attempt was made to validate the reported diagnoses by reference to medical records, or even to determine whether the respondents were referring to the primary diagnosis or to secondary sites to which cancer may spread, or to malignant or in situ or benign cancers.
In the same vein they complain that we used no independent medical - and, in particular, histological - verification of cancer diagnoses. These points are quite ridiculous: interrogating people about their responses would have been an intrusion too far, and CBOs can have no access to medical or histological records. If the authorities are serious about not wishing to criticise the principle of CBO health surveys they will have to undergo a large cultural realignment, to facilitate the kind of joint working the CBOs had already asked for and the access to health and histological records which COMARE seems to be saying is essential.

5. Our study accepted that the PCAH survey was not the most accurate way of obtaining cancer incidence [data] ... (see 'Is the method biased?', Green Audit 2002, p. 11). We identified four types of problem:

  • dead people not filling in questionnaires,
  • surviving family members moving away,
  • misreporting,
  • refusal to participate due to being too upset following a recent loss from cancer.
All these are "data leakage" factors and have the effect of reducing the apparent prevalence, causing the survey to underestimate risks. There were two instances of householders declining to participate because of distress over recent cancer diagnoses or deaths and PCAH’s researchers said neighbours had told them of other people who did not respond for the same reason. So in the PCAH study self reporting biased the results towards underestimating the hazard.

6. COMARE simultaneously accuses us of under-ascertainment and over-ascertainment.
The criticism that PCAH only found 27% of the cancers registered by the SWCIS arises from COMARE's big mistake (i.e. the assumption that only 30% of the population had filled in questionnaires). In fact, since PCAH surveyed 30% of the people in the North ward - a sample which should contain roughly 30% of the ward's cancers - finding 27% of them implies an ascertainment of around 90% within the sample: pretty impressive reliability. (Maybe the missing 3% were the people who were too upset to do the questionnaire.)
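
The arithmetic is simple enough to set out explicitly; a minimal sketch, using the percentages quoted above:

# Implied ascertainment if an unbiased 30% sample of the ward
# contains roughly 30% of the ward's registered cancers.
sample_fraction = 0.30          # share of the ward population surveyed
cancers_found_fraction = 0.27   # share of all registered ward cancers reported

ascertainment = cancers_found_fraction / sample_fraction
print(f"Implied within-sample ascertainment: {ascertainment:.0%}")  # 90%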

PCAH found 100% of the cervical and kidney cancers. COMARE questions this on the basis of the alleged self reporting bias, but the fact that the PCAH volunteers found all the kidney and cervical cancer cases is curious - nothing more. What exactly is COMARE’s point anyway? Burnham has large, undisputed and statistically significant excesses of both these cancer sites which require explanation. Would we ever have known about this if PCAH hadn’t had the guts to go out and get the data direct from the public?
September 2004 Note: COMARE's revised Statement, despite admitting the earlier mistake about the size of the response, has barely changed the earlier section 4 which is almost entirely based on the mistake. On a separate page (click here) we reproduce the offending section, with interpolated notes (in our house red) to point up the incompetent partisan nonsense.
(For those who are already sufficiently enraged here is a link to a page which offers a small practical remedy - a motion in the UK Parliament asking Mr. Blair to scrap COMARE and start again.)

7. COMARE criticises us [5(b)] for directing some questions only to the people with cancer. We accept this, but it was done to facilitate data collection in a study which (to repeat) was intended to be indicative, not definitive. What COMARE fails to account for is the "respondent overload" problem inherent in any questionnaire: the form has to be designed so as not to discourage participation. For example our study says (p. 9) "No internal controls were available [for lifestyle questions] since it was considered that this would make the questionnaire too complicated." We made comparisons with national averages and, recognising the overload problem, PCAH supplemented the study with an entirely separate street survey of lifestyle. This found that the population stopped on the streets had exactly the same proportions of smokers and water sports participants as the cancer patients.

8. COMARE says [6(a)] that we have confused prevalence with incidence. They list this as a problem and they seem to be saying that we misused the word prevalence, but they do not say in what way this could be problematic. We conclude that it’s just snow. The PCAH paper shows unequivocally that the phenomenon being studied is cancers diagnosed within a well defined population within a given period, irrespective of whether the person was still alive at the time the data were collected. We were therefore dealing with both incidence and mortality, to which we applied the term prevalence. What’s wrong with that?

9. COMARE made much of observations (made by SWCIS) that Standardised Registration Ratios (SRRs) were elevated in wards remote from the contaminated mud as well as in Burnham and Highbridge. This is disingenuous, since the remote wards (which we can identify as Exmoor, Haddon, Blackdown and Wessex) have very small populations, and fluctuations of the size found are not statistically significant in populations so small.

10. COMARE says It is not clear if [the] cases [reported by GA] were diagnosed while a resident of Burnham. This is completely untrue; the questionnaire included a question about people diagnosed with cancer, leukaemia or lymphoma, asking how long they had lived there.

11. COMARE complains that The two periods analysed, 1996 - 2001 and 1998 - 2001, overlap considerably. The results for the two periods are not independent of each other and therefore the quoted p values do not properly represent the evidence. This is nonsense, and it calls into question COMARE's understanding of epidemiology and Poisson statistics. They are citing a rule by which epidemiologists seek to prevent the data which generate a hypothesis (e.g. the discovery of a leukaemia cluster) from being included in a subsequent study which might be set up to test that hypothesis; one is supposed to look for a separate population exposed to similar hazards, or a different time period for the same population. What we did was quite different. We analysed two overlapping time periods for the same population with the express intention of testing the influence of data leakage (which we discuss in para. 5 above). The p values are valid, since the Poisson test can be applied to the significance of any question one cares to ask of the data; in this case each stands independently as a test of the significance of what was found. We did not multiply the p values together, as we could have done had the data sets been independent - as they were, for example, in the case of the increases in infant leukaemia in a number of countries after Chernobyl.
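
For illustration, had the two periods been independent, the separate p-values could legitimately have been pooled, for example by Fisher's method. A minimal sketch, using the kidney cancer p-values from Tables 4 and 5 purely as example inputs:

import numpy as np
from scipy.stats import chi2

# Fisher's method for combining p-values from INDEPENDENT tests.
# Not valid for Tables 4 and 5, whose periods overlap; shown only to
# illustrate what we deliberately did not do.
p_values = [0.01, 0.01]                      # example inputs (see text)
stat = -2 * np.sum(np.log(p_values))         # ~ chi-squared with 2k degrees of freedom
combined_p = chi2.sf(stat, df=2 * len(p_values))
print(f"Combined p, if the tests were independent: {combined_p:.4f}")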

12. There is one interesting and highly questionable discrepancy between the SWCIS study and COMARE's Statement: COMARE reduces the SWCIS study to one ward. To repeat, the PCAH questionnaire survey was an indicative study, and the SWCIS study is to be welcomed for extending the investigation to areas PCAH could not attempt. Why didn't COMARE look at the data SWCIS produced for the larger area, as we now have (see para. 14 below)? Possibly because they didn't want to draw attention to the fact that the risks in Burnham North are mirrored in the surrounding wards.
Other dubious aspects of COMARE’s analysis are:

  • that they did not use Poisson statistics for the small numbers involved. It is generally accepted that chi-squared cannot be used for such small numbers because it rests on a Gaussian approximation which breaks down at small expected values (see the sketch below);
  • that they seem to have used a statistical significance level of 1% on a one-sided test. This is outrageous. The 5% level on a two-tailed test is standard, especially in such a small study.
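
The point about chi-squared can be illustrated with the Table 4 leukaemia figures (4 observed against 0.976 expected). A minimal sketch comparing the exact Poisson test with a chi-squared test on the same counts:

from scipy.stats import poisson, chi2

# Exact Poisson test vs chi-squared (Gaussian approximation) for a
# small expected count, using the Table 4 leukaemia figures.
obs, exp = 4, 0.976

p_poisson = poisson.sf(obs - 1, exp)    # exact one-sided test, ~0.017
stat = (obs - exp) ** 2 / exp           # Pearson chi-squared statistic, ~9.4
p_chi2 = chi2.sf(stat, df=1)            # ~0.002

print(f"Exact Poisson p = {p_poisson:.3f}; chi-squared p = {p_chi2:.3f}")
# The approximation misstates the significance by nearly an order of
# magnitude - with expected counts below about 5 it cannot be trusted.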

13. All these mistakes in COMARE’s Statement, many of which they have reproduced parrot fashion from SWCIS, are due to a failure to communicate. This is a cultural problem of remote and arrogant authorities. If COMARE and SWCIS had been willing to discuss the issues with us the public would not have been misled by them, but there is every indication that their primary aim is to shoot the messenger.

14. The SWCIS study (SWCIS 2003) is to be welcomed since, as the CBOs wanted, it extended the research beyond what PCAH could conceivably have attempted. It also serves as a source of some of the cancer incidence data that had been requested. In Fig 3.2 of the report SWCIS gives a series of Standardised Registration Ratio (SRR) vs. distance scatterplots, together with the results of linear regression analyses, for the cancer sites examined - all malignancy, lung, breast and leukaemia - both for distance from the power plant and for distance from the centre of the mud flats. SWCIS did not use the same centre for the mud flats as we used in the cancer mortality studies, but the difference is small and probably has little effect on the results.

Here we draw attention to the real effects shown by the SWCIS scatterplots. These are fairly obvious to the informed eye; at the public meeting in Burnham (7th May 2003) at which SWCIS presented their findings, Dr Richard Lawson pointed out that a sea coast effect was discernible. SWCIS refused to give the data on which the graphs and their analysis were based, so in order to make the calculations reported here it was necessary to take the data points off the graphs: two of the graphs were photo-enlarged and the data points were read off using dividers and ruler. This recovered the raw data on which the SWCIS statistical analyses were based.

The SWCIS conclusion was that there was no statistically significant sea coast effect of the type we described for mortality. However, they had examined all wards in Somerset out to 60km from the coastal point of interest, a distance over which it is unlikely that any sea coast effect due to sea-to-land transfer of radioactivity could operate. Using the data we obtained from the graphs we examined the effect of restricting the analysis to a smaller distance, out to 35km. The results for 'all malignancy' based on (a) Hinkley Point and (b) the mud flats as the centre of risk are shown in Table 2 below, which gives the results of ordinary least squares regression for the 0-60km and 0-35km wards.

Wards within km                  β           R-squared   t-statistic   p-value
0 - 60 (SWCIS)                   not given    0.001       not given     0.779
0 - 60 (this response)           -0.06        0.004       -0.7          0.48
0 - 35 (this response) (a) HP    -0.39        0.05        -2.2          0.032
0 - 35 (this response) (b) mud   -0.44        0.06        -2.35         0.021
Table 2: Regression coefficients for the equation SRR = β × distance + E, using data from SWCIS, for all-malignancy SRR in Somerset wards 1990-99

This analysis (where β is the gradient of the fitted line and the t statistic tests whether that gradient differs significantly from zero) shows that the sea coast effect is present in the SWCIS data for the all-malignancy SRR. The use of a 60km radius of effect by SWCIS effectively removed the effect, by weighting the regression line with SRRs from wards so remote from Hinkley Point that any exposure-related effect is extremely unlikely. Inspection of the results given by SWCIS suggests that the same is true for all the other cancer sites they examined, particularly leukaemia. The strengthening of the effect on moving the putative source centre from the power station to the mud flats is clear in all the SWCIS published scatterplots, and also in the statistical data given on them in the SWCIS report; it can be seen in Table 2 above, where the p-value falls from 0.032 to 0.021 on moving the origin of the effect to the mud flats.
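
A minimal sketch of the kind of regression reported in Table 2. The (distance, SRR) pairs below are illustrative placeholders only, since the real points had to be read off the SWCIS scatterplots and are not reproduced here:

import numpy as np
from scipy.stats import linregress

# Ordinary least squares fit of SRR = beta * distance + E, restricted to
# wards within 35 km, as in the bottom rows of Table 2.
distance_km = np.array([2, 4, 7, 11, 16, 22, 28, 34])   # hypothetical wards
srr = np.array([1.45, 1.30, 1.18, 1.05, 0.98, 0.95, 0.90, 0.92])

mask = distance_km <= 35                   # restrict the radius before fitting
fit = linregress(distance_km[mask], srr[mask])

print(f"beta = {fit.slope:.3f}, R-squared = {fit.rvalue**2:.3f}, "
      f"t = {fit.slope / fit.stderr:.2f}, p = {fit.pvalue:.3f}")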
Conclusion
The sea coast effect which we found in mortality studies is reproduced in data provided by SWCIS. It was their use of a much larger assumed distance of effect, exacerbated by the decision to fit a straight line rather than a curve, that enabled them to argue, falsely, that there is no statistically significant effect.
15. The Breast Cancer Screening van Effect
SWCIS has claimed that the apparent excess of breast cancer is caused by peaks in registration which represent the cases discovered by screening, and that these are followed by dips which cancel them out. This is fatuous. There is a 2-fold excess of breast cancer in Burnham over the 9-year period 1992-2000, apparent in SWCIS's own figures. Over such a period any short-term local variations caused by screening would be smoothed out, and UK average rates represent a valid control population - is the rest of the UK not also being screened? The excess is also seen in breast cancer mortality, on which screening should if anything have the opposite effect - unless X-ray mammography is killing people.
16. Chronic Lymphocytic Leukaemia
There is no evidence that chronic lymphocytic leukaemia (CLL) is not caused by the internal radioactivity which the people of Burnham inhale, ingest or absorb through their skin. COMARE's assertion that CLL cannot be radiogenic is based on a radiation risk model derived from observations of cancer and leukaemia in the survivors of the Hiroshima and Nagasaki bombs. As has been pointed out many times, those people were exposed to acute external radiation, and the results of studying them cannot be extrapolated to chronic internal contamination.
17. Secrecy.
We stand by our criticisms of the secrecy surrounding cancer data. The central point is that it’s not about patient confidentiality at all - it’s about the authorities avoiding potential embarrassment, since health data can be used to reveal risks associated with point sources of pollution and other hazards.
This is information which the public owns and public trust in its value and usefulness is eroded if public interest groups (such as us) are not allowed to use it for investigating well specified hypotheses. Over the last ten years new and ever more restrictive rules have been applied in order, it is claimed, to protect patient confidentiality. There used to be a reasonable criterion that confidentiality would be safeguarded so long as the study population (the “denominator”) was at least 1000. For example if there were 1000 women in Burnham North, then releasing the number of breast cancers diagnosed in a year (the “numerator”) would not allow any of the patients to be identified - even if there was only one patient. However, a new criterion for the size of numerator has now been added - information will not be released if data cells have fewer than 5 counts. For many cancer sites this effectively blocks release, especially in rural areas where populations are so small that annual data will commonly have fewer than 5 counts per ward.
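
The two rules can be stated precisely. A minimal sketch of the release logic as we understand it (the function name is ours, not ONS's):

# Old denominator rule plus the new small-numerator ("fewer than 5
# counts") rule, as described above.
def may_release(numerator: int, denominator: int) -> bool:
    if denominator < 1000:   # old rule: study population of at least 1000
        return False
    if numerator < 5:        # new rule: suppress cells with fewer than 5 counts
        return False
    return True

# One breast cancer among 1000 women: no patient is identifiable, and the
# old rule would allow release, but the new rule blocks it.
print(may_release(numerator=1, denominator=1000))   # False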
The Office for National Statistics (ONS) and the Department of Health adopted this rule as “an interim measure while risks of other methods [of protecting confidentiality] are evaluated.” Last year we asked Dr Peter Goldblatt, Director of Health and Care Division of ONS, what the other methods might be and who was evaluating them, but he hasn’t answered.
A letter from ONS and the Department of Health shows that their "interim measure" is meant as a guideline for publications, web sites, annual reports and other openly accessible literature, but they are using it to deny data even to Government Ministers. It has now been extended to mortality data as well. Whereas over the last few years ONS would (at a price) sell electronic breakdowns showing causes of death for small areas, they have now realised that this too is potentially embarrassing to the authorities.
We presented these considerations to the International Workshop of the Committee Examining Radiation Risk of Internal Emitters in Oxford in July 2003. Delegates supported our criticisms of the new 5 counts rule, saying that it was too restrictive and that many researchers and the cancer registries themselves were restive about it. The Director of one national Cancer Registry said there was no good reason for it.
We are already working with some data in other forums and have signed confidentiality agreements. The important thing is to get the work done; confidentiality is merely a matter of ensuring that, when it comes to publication, the results cannot be used to identify vulnerable people. All along we have proposed working with the Health Authority and the Cancer registry to agree protocols. Then we and the authorities would have equal access and the public could feel that they weren’t being hoodwinked.
References

Bowie C and Ewings P D (1988) Leukaemia incidence in Somerset with particular reference to Hinkley Point. Taunton: Somerset Health Authority.

COMARE 2003 Statement on Green Audit Occasional Paper 2002/5, "Cancer in Burnham on Sea North: Results of the Parents Concerned About Hinkley (PCAH) Questionnaire", 25th November 2003. [http://www.comare.org.uk/statements/comare_statement_burnham.htm] (This link takes you to their amended version. The original is no longer on the COMARE web site. We can email it to you if you want it.)

Green Audit 2000 Cancer mortality and proximity to Hinkley Point nuclear power station in Somerset 1995-1998. Part 1: Breast cancer; Part 2: Prostate cancer; Part 3: All malignancy, lung cancer, stomach cancer; and Summary of results.

Green Audit 2002 Busby C and Rowe H, Cancer in Burnham on Sea North: Results of the PCAH Questionnaire. Green Audit Occasional Paper 2002/5, Aberystwyth.

NRPB 1988 Gamma radiation levels outdoors in Great Britain. NRPB R191, Chilton.

PCAH 2004 Julian Plested, letter to Bryn Bridges, 26th February 2004.

SWCIS 2003 Cancer incidence in Burnham North, Burnham South, Highbridge and Berrow. South West Cancer Intelligence Service, September 2003.

