Saturday 12 April 2014

Too much fruit and veg is bad for you, says study

You remember the study that suggested the 5-a-day fruit and veg advice should really be 7-a-day? Or even 10-a-day?

Well, I just read a study that suggested – using exactly the same methodology – that 7-a-day is WORSE for you than 3-a-day. Yes, THREE a day.

And you know what?

It was the same study.



The current coverage of the Tamiflu scandal (and £500 million fraud) is one example of science being misrepresented for gain, and the recent paper suggesting that the current five-a-day fruit and veg advice should be changed to seven-a-day was another.

And while not in the same ballpark of atrocity, it is, I think, worth pointing a shaming finger at.

While some media reports suggested the study had problems, there was a glaring issue that meant the researchers – or at least, whoever wrote the press release – should hang their heads in shame, as should the media for not calling them on it.

Because when it was time to create a press release, they took one result from their paper and went for the spin that would get the one thing they really wanted: media coverage.

Here is the paper.

Their conclusion is actually fairly reasonable. The problem is when the press release says this:

‘The study findings imply that even those who do get their recommended quota, need to eat more’, they say. ‘Is it perhaps now time for the UK to update the “5 a day” message to “10 a day”?’ they ask.


This is nonsense, and it is intentionally provocative nonsense aimed at ensuring news coverage.

(Let's face it. This is how researchers get their grants.)

Hopefully, this is especially good news for all you evil parents who thought you were damaging your kids by only managing to give them 3 a day. You are not evil. Tired, but not evil.

I’ll remind you of the study. Take 65,000 people, ask them about their lifestyles, and check back in eight years to see who died.

Then, try to work out (adjusting, somehow, for age, education, BMI and the rest) whether the risk of death correlates with intake of fruit and veg.
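(For the curious, here's the rough shape of that calculation as a minimal Python sketch. Everything in it is made up - the data, the column names, the death rate - and the paper's real analysis involved covariate adjustment and, presumably, proper survival modelling; this just shows the crude 'death rate per portion band, relative to baseline' idea.)

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 65_000
df = pd.DataFrame({
    "portions": rng.integers(0, 10, size=n),  # daily 80g portions (invented)
    "died": rng.random(n) < 0.04,             # died during follow-up (invented)
})

# Bucket into the bands quoted in the media coverage
bands = pd.cut(df["portions"], bins=[-1, 0, 3, 5, 7, 100],
               labels=["<1", "1-3", "3-5", "5-7", "7+"])

# Crude death rate per band, relative to the <1-portion baseline
rates = df.groupby(bands, observed=True)["died"].mean()
print((1 - rates / rates["<1"]).round(2))     # 'risk reduction' per band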

The media tended to quote one set of figures that suggested that, relative to the baseline risk-of-death of people who ate very little fruit and veg (less than 80g a day), your risk of death decreased as follows (portion=80g):

Eating 1-3 portions a day: reduced by 14%
Eating 3-5 portions a day: reduced by 29%
Eating 5-7 portions a day: reduced by 36%
Eating 7+ portions a day: reduced by 42%

Wow! Look at that! Eating 7+ is 42% better! It’s 6% better than 5-7, and is 13% better than 3-5! And that’s science!

But not so fast.

The media were always going to focus on the most extreme of their results, and to suggest it’s quite so clear-cut is a teeny bit of a fib.


First, there is the simple fact that averaging out results can be uninformative at best, and at worst can be damaging. The results – and hence the advice – could be very different for men and women, for example, which is the case for the advice on daily calorie intake.
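(Here's a toy illustration of the trap, with numbers I've invented entirely - statisticians call it Simpson's paradox: an effect can point one way inside every subgroup and the opposite way once you pool everyone together.)

import pandas as pd

rows = [
    # group, eats 7+ a day?, people, deaths (all invented)
    ("smokers",     True,  9_000, 1_260),
    ("smokers",     False, 1_000,   150),
    ("non-smokers", True,  1_000,    20),
    ("non-smokers", False, 9_000,   270),
]
df = pd.DataFrame(rows, columns=["group", "eats_7plus", "n", "deaths"])

# Within each group, the 7+ eaters do slightly better...
for name, sub in df.groupby("group"):
    sub = sub.set_index("eats_7plus")
    print(name, (sub["deaths"] / sub["n"]).round(3).to_dict())

# ...but pooled together, the 7+ eaters look far worse, simply
# because (in these invented numbers) most of them are smokers.
pooled = df.groupby("eats_7plus")[["n", "deaths"]].sum()
print((pooled["deaths"] / pooled["n"]).round(3).to_dict())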

Luckily, the researchers give us a few of these alternatives, restricting the results to certain groups and seeing how the figures look. And they all pretty much say the same thing, right?

Wrong.

The most striking example was obtained by restricting the study to what they call never-smokers: people who have never smoked regularly, which was almost half of the study population.

The headline for this is very different: too much fruit and veg is bad for you.

What? How? The figures are as follows:

Eating 1-3 portions a day: reduced by 6%
Eating 3-5 portions a day: reduced by 24%
Eating 5-7 portions a day: reduced by 28%
Eating 7+ portions a day: reduced by 23%

The first thing to note is the big jump at 3-5, going from 6% to 24% reduction, which is encouraging for those of us who try but fail to hit the five. Then, 5-7 gets a modest improvement, with a 28% reduction. But look carefully at what happens next. The 7+ group have a… 23% reduction?

Yes, the 7+ group do worse than the 3-5 group.

The simplistic conclusion from this result would be that eating seven-plus portions, for non-smokers, is WORSE for you than eating 3-5 portions.

Now, the researchers were happy to suggest that public health policy should change based on a six-percentage-point difference in one set of results. Yet when another set of results shows a five-point difference implying that eating more than seven a day is a BAD idea, they don't mention it.

(Note that the non-smokers had a much reduced risk anyway, so all these figures are relative to a baseline that already has a far lower risk of death than in the general case. That's right, folks - if worrying about not eating enough veg made you so stressed you needed a fag, you have just been the victim of the Universe's twisted sense of humour.)

It would be fascinating to see the figures for the have-been-regular-smokers group, as well as the current-smokers group, as they surely have to show a much larger positive effect for 7+ than even the media-friendly overall result did: if the never-smokers - almost half the sample - only manage a 23% reduction, the smokers must be doing spectacularly well out of their extra veg to drag the overall figure up to 42%.


Also, as the media reports did typically note, the idea that you can successfully adjust the figures for age, social class, BMI, etc (and, as one great phrase in Table 5 puts it, “adjusted for age, sex, social class, cigarette smoking, BMI and all other fruit and vegetable variables”) is stretching things somewhat.

That kind of statistical messing-about might be useful to show that there is an effect, yes, but to suggest that the result is then something people can directly apply to their own situation is misleading. Because you have adjusted the figures.

As far as I can see, the paper itself doesn’t even go into detail on what magical techniques achieve this untangling of variables.

Now, I’m no statistician, but I suspect there’s more than one way to do it, which would seem to offer the opportunity for endless after-the-fact tinkering until you get the results you want or expect. Which is handy.
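(A sketch of what I mean, on synthetic data with variables I've invented: the estimated 'effect' of portions on death shifts every time you change which covariates go into the model. This is not the paper's method - just a toy logistic regression showing that 'adjusted for X, Y and Z' is a family of modelling choices, not a single number.)

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 20_000
smoker = rng.random(n) < 0.3
portions = np.clip(rng.normal(4 - 2 * smoker, 2, n), 0, 10)  # smokers eat less veg
age = rng.uniform(30, 80, n)

# True model: death driven mostly by age and smoking, only weakly by portions
log_odds = -8 + 0.08 * age + 1.0 * smoker - 0.05 * portions
died = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(float)

for covars in ([portions], [portions, smoker], [portions, smoker, age]):
    X = sm.add_constant(np.column_stack(covars))
    fit = sm.Logit(died, X).fit(disp=0)
    print(f"{len(covars)} covariate(s): portions coefficient = {fit.params[1]:+.3f}")

With the smoker variable left out, portions look more protective than they really are, because smokers both eat less veg and die more - and every extra adjustment choice nudges the number again.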

Add to this the fact that several different analysis methods were used, producing notably different results.

Also, the media-quoted stats excluded people who died within one year of being surveyed, in an attempt to exclude those who were already seriously ill and may have adopted emergency dietary changes (which would make healthy food unfairly correlate with being very sick). It would be interesting to know whether the one-year cutoff was chosen because it led to the best results...

It is also a shame that the public health questionnaires the data came from hadn't asked pertinent questions about people's actual health - 'have you dramatically changed your diet recently because of health issues', say, or 'are you really, really ill?'
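(Back to that one-year cutoff: it's exactly the kind of dial you can twiddle. A sketch, again on invented data with made-up columns - rerun the same crude analysis excluding deaths in the first 0, 1 or 2 years and watch the headline number wobble, even though there's no real effect in this data at all:)

import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 65_000
df = pd.DataFrame({
    "portions": rng.integers(0, 10, n),
    # years from survey to death; NaN = still alive at follow-up (invented)
    "years_to_death": np.where(rng.random(n) < 0.05,
                               rng.uniform(0, 12, n), np.nan),
})

for cutoff in (0, 1, 2):
    kept = df[df["years_to_death"].isna() | (df["years_to_death"] > cutoff)]
    rate_7plus = kept.loc[kept["portions"] >= 7, "years_to_death"].notna().mean()
    rate_base = kept.loc[kept["portions"] < 1, "years_to_death"].notna().mean()
    print(f"cutoff {cutoff}y: 7+ 'risk reduction' = {1 - rate_7plus / rate_base:+.1%}")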



Last but not least: another problem is how broad the 95% CI ranges are.

In the figures given, they provided 95% confidence intervals. These (sort-of) give the range of plausible ‘true’ values behind the quoted figures (more accurately: if you ran the study many times, about 95% of the intervals computed this way would contain the true value).

Broad ranges are bad news… and these are broad.

For the headline figures, we have the following percentages quoted, here with the 95% CI ranges:

1-3: 14% (5%-21%)
3-5: 29% (19%-37%)
5-7: 36% (24%-47%)
7+:  42% (29%-54%)

These are pretty wide ranges - all basically plus-or-minus 10 percentage points. If another study showed a flat 29% reduction for everything from 3-5 upwards, it wouldn’t contradict these results.

At all.
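(Don't take my word for it - checking that is one loop over the CI figures above:)

cis = {"3-5": (19, 37), "5-7": (24, 47), "7+": (29, 54)}
for band, (lo, hi) in cis.items():
    print(band, "compatible with a flat 29% reduction:", lo <= 29 <= hi)

All three lines print True: a thoroughly boring 'everything above 3-5 portions is the same' study would sit comfortably inside these intervals.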

And they say we should change public health policy based on this?

Of course they do.

That way, it’s news. Any other way, and the study just gets ignored.






NOTE: Some of the figures in the paper are clearly incorrect. For example, the introduction quotes 48.4% of those in the study as having never smoked regularly, in a sample of 65,226, yet the sample size quoted for the never-smokers result was 43,973, which is actually 67.4% of the sample.

Also, one of the analyses quotes a study size of 84,894 participants. Out of the 65,226 taking part.
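(Both sanity checks take one line each:)

total = 65_226          # sample size quoted in the paper
never_smokers = 43_973  # sample size quoted for the never-smokers analysis
print(f"{never_smokers / total:.1%}")  # 67.4%, not the 48.4% the introduction claims
print(84_894 <= total)                 # False: one analysis has more people than exist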

Another point, though: Imagine how much fun it would be to have all this data publicly searchable…

(And yes, I do have a deadline approaching, which is why I ended up spending two hours reading a paper on vegetables...)



EDIT: I just noticed that in their results table, one of the sets of results is labelled 'physical activity years only', and those results are amazing compared to the others - I mean, they're really the best set of results they have, overall.

So, what does 'physical activity years only' mean? It must be meaningful, right? Maybe it rules out really old people who don't get around so much?

No. What it means is this: of the years over which the data was collected (2001-2008), only some (2002, 2003, 2004 and 2006) included questions about physical activity levels.

So, that amazing set of results is really for just a RANDOM SUBSET of the data. The physical activity questions have nothing to do with the actual results. It does, though, give them a better hook to hang the results on than 'random years which made the results look great'. They should have tried all possible combinations and subsets; maybe there would have been something even more misleading... sorry, impressive.
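(A toy demonstration of why cherry-picked subsets always look good - numbers invented, with no real effect anywhere: give each of the eight survey years a noisy estimate around zero, then pick the best-looking four-year combination.)

import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
years = list(range(2001, 2009))
# One noisy per-year 'risk reduction' estimate; the true effect is zero
per_year = dict(zip(years, rng.normal(0, 0.05, len(years))))

best = max(combinations(years, 4),
           key=lambda combo: np.mean([per_year[y] for y in combo]))
print(f"best-looking 4-year subset: {best}, apparent effect "
      f"{np.mean([per_year[y] for y in best]):+.1%} (true effect: zero)")

There are seventy possible four-year subsets; one of them will always look great.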


 

EDIT: I also just noticed that my description of the study itself was wrong: the data came from surveys done over 2001-2008, but all the mortality data was collected in 2013.

In other words, the people surveyed in 2001 had an extra seven years to die compared to those surveyed in 2008. I don't think they specified how many deaths were recorded for each of the survey years, but I would be willing to bet that the people surveyed in 2001 showed the most deaths.

They also don't specify whether the dietary habits showed any trends across the survey years, but it would be worth checking whether fruit & veg consumption in 2001 was typically lower than in later years - since 2001 is almost certainly the survey year with the most deaths, that would bias the link between deaths and lower consumption.
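(A back-of-envelope version, assuming a constant 1% annual death hazard - my number, not the paper's:)

hazard = 0.01  # assumed constant annual probability of death
for year in range(2001, 2009):
    follow_up = 2013 - year
    died_by_2013 = 1 - (1 - hazard) ** follow_up
    print(f"surveyed {year}: {follow_up} years of follow-up, "
          f"~{died_by_2013:.1%} dead by 2013")

On that assumption alone, the 2001 cohort racks up more than double the deaths of the 2008 cohort before diet has anything to do with it.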
