Monday, March 31, 2008

The Rise & Fall of Lifestyle Nutritionists

Just a quick note to say that Ben Goldacre's excellent series, The Rise Of Lifestyle Nutritionists, continues on BBC Radio 4 tonight (31/03/08) at 8.00pm.

Because of the unique way the BBC is funded (i.e. by ME), non-Brits can still get access to all the good stuff the BBC churns out (and the not-so-good stuff) by using the BBC iPlayer.

EDIT: Apparently you can't! BBC iPlayer is UK-only; however, you can listen to the show from outside the UK using the BBC's Listen Again function, but only for 7 days!

It has already ruffled the feathers of old friend Patrick Holford (Pilltrick to his friends), as HolfordWatch has blogged.

Should make for entertaining listening :)

EDIT: Join the discussions here

Tuesday, March 25, 2008

Bluffer's Guide to Consumer-related Science Papers - Part 2

In Part 1, we looked at a few basic questions to ask when looking at a scientific paper to establish whether the research is much cop.

The questions were as follows:

1. What Journal is the paper published in and is it any good?
2. How many people were in the trial and how many dropouts were there?
3. What was the control and what was the active ingredient?

And one suggested by Martin @ LayScience:

4. Who are the authors, and how were they funded?

The paper that has thrust itself recently into the headlines is this one:

Cod Liver Oil (n-3 fatty acids) as an (sic) non-steroidal anti-inflammatory drug sparing agent in rheumatoid arthritis

B. Galarraga et al, Rheumatology, 2008, EPub [Ahead of Print]

The reason why this paper is a good one for learning how to read scientific papers should become apparent as we go on. Needless to say, Dundee University's well-oiled press machine is working at full strength (remember "Black Tea Helps Combat Diabetes"?) and so the story has been picked up by the BBC and the Daily Telegraph, amongst many others. The prize for the worst headline, incidentally, goes somewhat surprisingly to Channel 4 News, who fell into the age-old trap of confusing the pain caused by the disease with the disease itself (Cod Liver Oil Does Ease Arthritis). Even the Daily Mail managed to stay out of that trap, though it happily feeds the press release to the ScienceMangle™ and prints whatever comes out.

In essence, the message is that taking cod liver oil capsules can mean cutting down on painkillers (and hence their side effects) for people with rheumatoid arthritis.

That, to me, if true, is a Good Thing. Let's look at the research:

Firstly, it has the golden words "This was a double-blind, randomised, placebo-controlled study". Music to the ears. Proper job.

1. What Journal is the paper published in and is it any good?

Rheumatology. No problems on this front - it is a peer-reviewed PubMed journal.

2. How many people were in the trial and how many dropouts were there?

Hmmm... not so easy to answer. The trial ran at two locations from August 1997 to December 2002. (Why the dickens it has taken over 5 years to complete the analysis is beyond me.) In total, 97 patients were part of the study. This is close enough to our n>100 rule of thumb for it not to matter too much; however, only 58 patients completed the study. Suddenly the results are looking shaky. Without going too much further, with n=58 it will be difficult to make firm conclusions, even if all the other parameters are fine. Why did so many drop out? Well, the paper gives a good critique (as it should) of why people dropped out. One reason I would like to press further (see below) is that about 20% of both the cod liver oil group and the placebo group dropped out through voluntary withdrawal.
Amongst concerns raised by this group of patients were the large size and number of capsules to be taken daily, awareness that the capsules were empty and dislike of the fishy taste of the capsules.

Ah. This should have set alarm bells ringing. It will do when we look at the next question:

3. What was the control and what was the active ingredient?

The active ingredient is n-3 (also called omega-3) fatty acids found in cod liver oil (specifically, Seven Seas Marine Oil 1, an n-3 long chain fatty acid-rich clinical grade high strength cod liver oil).

The control was "an identical air-filled capsule". Hmmm. Identical to look at, perhaps, but what about the weight? Hence the worrying comment about "awareness that the capsules were empty". And the fishiness? This, to me, is a shame.

The whole concept of a double-blinded, placebo-controlled trial is that neither the patient nor the person administering the treatment (hence double-blind) knows which capsules are which. It turns out that both could tell which was which - certainly, anyone burping up clouds of fishiness had a fairly good idea.

Why did the researchers not fill the capsules with a non-fish fatty acid, as these Dutch researchers did when trying to find out if the n-3 fatty acids in fish oil prevent cardiac arrhythmia? (Their placebo was a high-oleic sunflower oil.) Obviously the fishiness may still figure, but this to me is a basic cock-up. Again, in fairness to the researchers, they critique themselves admirably:

We may have compromised the double-blinding of the study by using air-filled capsules as placebo. Although it was recognised that some patients would discover their capsules to be empty and others may realise about the capsules' lack of fishy smell and taste, the air-filled capsules were selected as being the most appropriate placebo available after critical appraisal of alternatives. The possibility of using capsules filled with other fatty acids was rejected as none are believed to be truly inert and saturated fats may be associated with a health risk.

I don't buy either of those two excuses. It sounds like (but may not be so) the peer reviewers have insisted that there be something in the discussion about the double-blindedness of the trial. Or perhaps it's something else...

4. Who are the authors, and how were they funded?

Aha. The authors themselves are not an issue, but the funding came from Seven Seas themselves. This does not need to be a problem, but it can provide a bit of fodder for discussion. For instance, if you had a product that you wanted to sell more of, it makes sense to get some research done and, on the back of the generated PR, sell more product. Providing the research is independent and of good quality, it shouldn't matter where the money comes from. A sceptic might say that Seven Seas wanted the research to be good, but not that good.
Certainly, the effort over 10 years to gather data, randomise the trials, run two separate centres, publish in a quality journal and team up with sound scientific researchers is commendable - less upstanding companies might just have given the fish oil to a load of schools and told them to eat up, without really doing it properly. And yet, to cock the blinding up so completely as to lower the worth of the paper from research to PR chaff is a sorry shame.

In fairness to the researchers and to Seven Seas, they published the research, held it up to scrutiny, paid for open access for the public (Rheumatology is normally pay-walled) and did find a small but significant reduction in self-reported pain (a measure very susceptible to the placebo effect) in the cod liver oil group compared to the placebo group.

Surprise surprise, the sample is too small and there are other 'issues' with the experimental design to draw firm conclusions. As we saw with Life Mel, more research is needed, but the papers are already full of suitable PR that I'm sure Seven Seas will be very happy with.

I wonder whether they will ever get round to doing more research? If it took them 10 years to do this study, I wonder how long a large one would take.

Of course, they don't mention (and why would they?) that 10 capsules a day of 1000mg High Strength cod liver oil is a LOT of pill swallowing. Neither do they mention that this will cost over £10 a week. And of course they don't mention that it would be much better to eat oily fish - by having your 4 portions a week, as recommended by the Food Standards Agency, you can have a much more enjoyable time ingesting the omega-3s (and get a similar amount).
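The sums behind that, for the curious, are simple enough (a rough sketch using the figures above; the £10/week price is the post's approximation, not a figure from the paper):

```python
# Back-of-the-envelope arithmetic on the trial dose and cost
# (10 x 1000 mg capsules a day, ~£10 a week -- approximate figures).
capsules_per_day = 10
mg_per_capsule = 1000

daily_dose_g = capsules_per_day * mg_per_capsule / 1000
annual_cost_gbp = 10 * 52  # ~£10 a week, every week

print(daily_dose_g)     # 10.0 g of cod liver oil, every single day
print(annual_cost_gbp)  # 520 -- over £500 a year
```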

Seared tuna with fresh lime vs 10 cod liver oil capsules. No brainer.

(With thanks to Gimpy and JDC).

EDIT: I should also point out that the NHS Behind the Headlines page has covered this here

Monday, March 24, 2008

Not Snowed Under...

Just spent the main part of the weekend traipsing the first 30 miles of the South West Coast Path (in glorious sunshine, and not snow like apparently everywhere else) and got back wanting to blog the sinister PR emanating from the Catholic Church this weekend regarding the Embryology bill. Turns out Joe @ Cotch dot net has written it for me (!), so I'll tend to my aching muscles and send you there. The cream of UK intelligentsia has been collated here at the Apathy Sketchpad.

Sorry for wussing out :)

Tuesday, March 11, 2008

Bluffer's Guide to Consumer-related Science Papers

I've been wanting to write something along these lines for a while and the opportunity seems to have presented itself at last. (Which is poor timing, as I won't have time to tell you about the teapot-worshipper who is going to prison for not abandoning any teachings contrary to Islam).

Wading through PR blurb and advertising copy that relate to 'sciencey' things can be an effort, but with a few key pointers, sense can be made of most of it. For instance, a company will put the strongest claim possible on its strapline, so woolly words like 'may', 'could', 'help', 'aid', 'reduce' and 'some' all go some way to indicating how much of the product's claims are real and how much are puff (perhaps a blog post for another day).
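Just for fun, the woolly-word test can even be done mechanically. A throwaway sketch (the word list and the strapline below are my own inventions, not from any real advert):

```python
import re

# Hedge words that let an advertiser imply much while promising little.
WOOLLY = ["may", "could", "help", "helps", "aid", "aids", "reduce", "some"]

def woolly_words(strapline):
    """Return the hedge words found in a piece of ad copy."""
    words = re.findall(r"[a-z']+", strapline.lower())
    return [w for w in words if w in WOOLLY]

print(woolly_words("Clinically proven: may help reduce the signs of ageing"))
# -> ['may', 'help', 'reduce']
```

Three woolly words in one strapline: a good sign that the "clinically proven" part is doing rather less work than it appears to.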

Things can get much trickier, though, if the advert in question claims "proved by scientific research". If there are grounds for this, the company will no doubt give full access to the research; otherwise, it's fairly safe to assume that it is selling a bum steer.

So far, so straightforward. Let's say our imaginary company has a product which claims to have scientific proof for its claims AND is presenting the research. Obviously, advertising will already have convinced/dissuaded a section of the public; the "proved by research" strapline will convince another swathe; and the presentation of a gritty, full-scale scientific research paper, with all the reader-friendly niceties (and pictures) removed, will certainly convince another swathe, especially if the reader is not used to dealing with scientific papers.

If you are like me (and if you are, I empathise with your shortcomings) then, depending on the product, the paper may or may not be read. It all depends on the claim. If a toothpaste claims to whiten my yellowing teeth, the paper may not get read. If a nutriceutical claims to alleviate my arthritis pain, then I am reaching for my intelligence glasses. (I don't wear glasses, but pretending to makes me feel more intelligent).

I have some experience in reading scientific papers. But only really in my own area; outwith that, it can be tough going. What hope is there for someone who is not used at all to dealing with the curt, unloving, perfunctory monotones of thick research text?

With a bit of persuasion, I hope to have here Dr* T's Bluffer's Guide to Published Research. This will probably be a work in progress, so I'm keen to hear of any additional points which will make it easier for people to wade through.

What I've decided to do is use a paper I've been reading recently as a case study.

The product in question is Life Mel Honey.

As a beekeeper, I was interested to learn that the nation's most paranoid shopkeeper, Mohammad (Not The Teddy Bear) Al Fayed, has started selling Life Mel honey in his large corner shop at the very reasonable price of £42 for a small jar.

The reason that this product commands a high price is because (apart from being in Harrods) there is some research to show that Life Mel honey can help patients having chemotherapy avoid a complication called neutropenia. This is caused by a lack of neutrophils (a type of white blood cell which serves as a defence against infections); when neutrophil counts drop (as can happen during chemotherapy), the whole immune system is compromised and the result can be life-threatening.

Sounds like good stuff - tell me more. Firstly, the advertising blurb: it's a nice website, and we get directed to Holywell Health, the distributor of Life Mel in the UK, which proudly cites the research published in Medical Oncology and has kindly put up a link to the .pdf of the paper.

Zidan J, Shetver L, Gershuny A, Abzah A, Tamam S, Stein M, Friedman E.;
Prevention of chemotherapy-induced neutropenia by special honey intake.
Medical Oncology 2006;23(4):549-52.

So far, full marks. How do we tackle the paper?

What Journal is the paper published in and is it any good?

If I started up Dr*T's Journal of Stuff, I could print what I like, and apart from a silly name, no-one would know whether it was any good or not.

To find out a bit more about a journal, start at PubMed. Medical Oncology is on there, Dr*T's Journal of Stuff isn't. If the research journal is not on PubMed, I wouldn't bother reading the paper. Med. Oncol is a peer-reviewed journal, so no qualms there. On to the actual paper...

How many people were in the trial and how many dropouts were there?

(Technically two questions, but no matter.) This is called the 'n'. For 'number'. Clever. How many patients took the treatment? The bigger the n, the more reliable the result.

e.g. I flip a coin twice, it comes up heads both times.
Result: It is a two-headed coin. (n=2)

I flip a coin 100 times, it comes up heads 100 times.
Result: It is a two-headed coin (n=100).
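To put numbers on why the second result is so much more convincing, here's a quick sketch of how likely a fair coin is to come up heads every time:

```python
# Chance that a FAIR coin shows heads on every one of n flips.
def p_all_heads(n):
    return 0.5 ** n

print(p_all_heads(2))    # 0.25 -- one time in four; tells us next to nothing
print(p_all_heads(100))  # ~8e-31 -- effectively impossible unless the coin is rigged
```

Two heads in a row happens a quarter of the time by pure chance; a hundred in a row essentially never does. That, in a nutshell, is why big n matters.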

There is no hard and fast rule to this, but I would reckon that the general consensus of sciencey types would be that n<100 is not much cop. It may be a small-scale trial which could then justify a larger-scale trial, but the trial by itself should not be relied on too heavily.

In this case, n=30, so the results in this paper are already hanging by a shoogly hook.
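To get a feel for just how shoogly, here's a rough sketch of the 95% margin of error on a simple proportion at various sample sizes (normal approximation, worst case p = 0.5 - an illustration of the general principle, not the statistics actually used in the paper):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (30, 58, 100, 500):
    print(f"n={n}: +/- {margin_of_error(n):.1%}")
```

At n=30 the uncertainty is around plus or minus 18 percentage points; even n=100 only gets you to roughly plus or minus 10. Small trials simply cannot pin down small effects.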

The dropout number is also important. Without going into details of bias, it's a bit like saying in our experiment above "I flipped the coin 100 times, but I only looked at the result on 10 occasions". In this case, as the patients were being given the honey as well as their medication, you wouldn't expect dropouts and indeed all 30 completed the course.
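The coin version of that bias can be sketched like so (a deliberately extreme selection, purely to show the mechanism):

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible
flips = [random.choice("HT") for _ in range(100)]  # a fair coin, 100 flips

# The full data: roughly half heads, half tails.
print(flips.count("H"))

# But if only the flips that came up heads had 'completed the trial',
# the surviving data would tell a very different story:
completers = [f for f in flips if f == "H"][:10]
print(completers)  # ten heads, zero tails
```

Real dropouts are rarely this blatant, but the principle holds: if the people who leave a trial differ systematically from those who stay, the completers alone can paint a misleading picture.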

What was the control and what was the active ingredient?

It's important to know what is being tested. In this case, Life Mel honey is being tested. So (in my simple head) one would expect the basic trial to be set up so that some patients got Life Mel honey, some patients got 'every day' supermarket honey, some people got a sugar syrup and some people got nuthin'. Because this particular experiment is in addition to the regular medication and treatment that the patients should be getting, there's no reason why it couldn't be done cheaply and easily.

If an effect is seen, we need to know whether it is because of a placebo effect (compare with the people who got nothing and with those who got sugar syrup). If not, is it because of something inherent in honey, or something inherent in that honey?

This is where the paper crumbles. There is no control - all 30 patients in the trial received honey. So there is no way of knowing if the response seen is because of placebo effect or some other feature of the experiment unrelated to honey.

There are other questions to ask (which will get added here in time), but for me this paper is passingly interesting and of no value without further research.

Bear in mind that we haven't really read the paper, or gone through the biology of the process, or analysed the methodology - just a few easy questions have weighed up this paper as being very much in the "nothing more than PR" camp.

It may be the case that further investigation will show some effect with everyday honey, or even with Life Mel honey, but with the evidence given it isn't possible to make the call.

However, as a salesman, I think Life Mel have done just enough to market the product to the vast majority of people, and have done well. The 20 people who will actually see the research as being of little value without follow-up work (it was published 2 years ago) are not the target market. Especially now they have celebrity endorsements and a full raft of papers running with the story, including The Daily Mail taking it to extremes - personal anecdotes sell product.

I wonder whether we will see a large-scale trial published in due course?

I'd be really interested if they do, but I am, as ever, sceptical.

If you have further questions that you think could be of benefit in weighing up a published paper, please feel free to leave a comment - thanks.

Sunday, March 2, 2008

Why is diabetes not everyone's cup of tea?

Heh, sorry - the title is the question that would be asked if the following BBC article came up on Radio 4's The News Quiz. Something that gets banged on about endlessly in the popular scientific press is the level at which science is pitched in "The News". Here we have the BBC taking a paper that has been published in an academic journal and removing anything of interest, which makes the BBC report imply something that, although not technically wrong, is certainly unfair to the truth.

Tea could help combat diabetes.

Drinking black tea could help prevent diabetes, according to new findings by scientists at Dundee University.

There's your hook - drink more tea, don't get diabetes. Fantastic, tell me more. In fact, don't.

Let's stop there. You see, the full paper, published by the Neurosciences Institute @ University of Dundee and the Scottish Crop Research Institute, Dundee, is available online (for free - hoorah!) here in the journal Aging Cell. In actual fact, what the researchers did say was (although this is not my area of expertise - let me know if inaccurate) that a member of a family of transcription factors (proteins that can turn specific genes on and off) called FOXO1a, which undergoes phosphorylation (the addition of a phosphate group) induced by insulin, can also undergo the same reaction induced by some theaflavins present in black tea. (And, to slap the BBC for using ambiguous imagery and wording: the reason they discuss black tea is not to do with milk destroying the effect, but to compare it with green tea, also studied in the paper.)

Aha - so it is suggested that this one reaction may be able to be encouraged to proceed even if insulin is not present. Most scientists would agree that biochemical pathways are pretty straightforward, as this magnificent page from ExPASy indicates.

(Having a wander round this map of metabolic pathways gives me a sense of incredulous wonder and conjures up many pertinent questions - how can homeopaths even begin to explain miasms or any other of the outdated, anti-knowledge hoo-haa when confronted with something like this? I digress). We are talking here about one reaction in a biochemical pathway.

So indeed, we have one transcription factor which has been shown to be sensitive to theaflavins in the same way it is sensitive to insulin. How soon before we can get tea on the NHS? Don't get me wrong, the research seems to be good, well-written, solid work, but to claim from it that drinking black tea will prevent diabetes is nonsense. Indeed, the researchers themselves are quoted as saying:

Our research into tea compounds is at a preclinical, experimental stage and people with diabetes should continue to take their medicines as directed by their doctor.

Bravo. So possibly (if everything goes according to plan) in about 10 years we may have some proof that some of the compounds in tea can help to reduce the incidence of Type 2 diabetes. Is this newsworthy? Should the BBC report these stories? In my view, no - or at least they should rethink how they write them. If they are going to claim that "Tea can help reduce diabetes", they should at least give some sort of time scale in the report, and an indication of the potentials and limitations. The lack of any science worth noting (I was surprised that the words 'molecular biology' weren't used - or even 'biochemistry') is also disappointing. To me, it seems the BBC doesn't want to 'get all sciencey' because a) it will lose some viewers and b) the chances are the journalist may not be scientifically trained. It is, however, happy to overinterpret papers and aggrandise possible distant future events as imminent in order to fill sensationalist copy and satisfy its quota for 'Science'.

In case you might be under the impression that this is a one-off, the BBC has form and a long standing relationship with tea:

It "reduces ovarian cancer risk", "could cut skin cancer risk" and is a "healthier drink than water" - although that last piece of research was funded by The Tea Council, so further probing might be needed. Also interesting is that the BBC puts as much effort into its picture choice as it does into its science writing.

I fancy a cuppa.

EDIT: The original press release is here and, apart from mentioning "test-tube studies", is many times better written than the BBC article it spawned.