JN interview, 12.12.06

December 18, 2006

1. Before becoming a journalist, did you attend any sort of journalism course? What type?

If yes: were there “ethics” classes on that course, or were the ethics of journalism discussed at all?

If no: what sort of on-the-job training did you do, and were ethics a part of that training?

Since leaving training and becoming a journalist, are ethics (still?) discussed at all?

In your opinion, should they be? Please note – this isn’t a trick question and I’m not presupposing the answer “yes”. Most people presumably have some idea of what it means to behave responsibly. There may be no need to formalise it in the workplace.

2. Is a working knowledge of the topic you are writing on an ethical requirement? To what extent is that the case?

3. As a health writer, how do you deal with the specialist knowledge the topic entails? Have you taken any special training or courses? Did your editor or any of your superiors insist upon these, or were they your own idea?

4. Naturally we can’t expect every health journalist to have a medical degree, so presumably there will be times when the topic will require specialist knowledge that the journalist in question simply doesn’t have; perhaps a grounding in statistical analysis, for example, to understand risk probabilities in a health-scare story. How does a conscientious journalist deal with this problem? For example:

  • Whose opinion should a journalist trust? 
  • How many opinions should they get before making up their mind, and if scientific opinion goes against the journalist’s instincts, how far should they push it?
  • What should we consider “balance”? Does taking two extreme views constitute balance?
5. When reporting on the results of a new study or trial, do journalists tend to simply use the press release, or do they read the whole report? Again, I’m not necessarily suggesting they should.

6. Are there any other problem areas for health journalism that you feel are interesting?

7. Do you have a science degree? (Probably should be question 1 or 2)


    Did eight weeks of a 12-week course at the London College of Printing; the ethical component consisted of the Sun’s part-time law correspondent giving a weekly lecture, which John described as “the law and how to avoid it”. John is, however, a philosophy graduate. He feels ethics are largely synonymous with “standards” in journalism; whereas a Sun reporter might be able to make stuff up, and their readers will expect it, someone on a broadsheet is meant to be offering a quality product. Also, ethics overlaps with “covering your arse” – your readers will include specialists on the topic, who are itching to write in and correct you. You will then have to explain carefully to your editor what you did to avoid representing the situation unfairly.

    Points out that since we no longer have a “man in the clouds” arbitrating our ethics, there is necessarily a certain amount of social realism involved. However, the basic ethical goal is ensuring your work is “true, fair, balanced and accurate”; he did acknowledge that there are obvious value judgements involved there.

    I questioned what it meant to be “balanced”, in light of Wakefield, or ID/evolution. He said that in these sorts of situations, where someone is arguing “white against black”, all one can do is report the story: explain that this controversy has led to, say, an x-percent reduction in vaccine coverage and how that can lead to an exponential increase in measles, but also the apparent risks according to the Lancet, and include interviews with, say, Wakefield, an MMR proponent, and perhaps pro- and anti-vaccine parents. He added that a side-bar on the risks of taking small studies at face value would be a good idea. Similarly, for ID, simply reporting the number of schools teaching the “controversy”, rather than putting an angle on the story, would be the better approach.
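    The coverage-to-exponential-growth arithmetic mentioned above can be sketched numerically. This is a minimal illustration, not anything from the interview: the R0 of 15 and the 95% vaccine efficacy are ballpark figures commonly cited for measles, chosen purely for the example.

```python
# Illustrative sketch: effective reproduction number under partial vaccine
# coverage. R0 = 15 and efficacy = 0.95 are assumed ballpark figures for
# measles, not numbers from any particular study.
R0 = 15.0
EFFICACY = 0.95

def effective_r(coverage: float) -> float:
    """R_eff = R0 * (1 - coverage * efficacy)."""
    return R0 * (1 - coverage * EFFICACY)

# With these numbers, coverage must exceed (1 - 1/R0) / efficacy -- about
# 98% -- to keep R_eff below 1. Below that threshold each generation of
# cases is larger than the last, i.e. cases grow exponentially.
for coverage in (0.99, 0.95, 0.85):
    print(f"coverage {coverage:.0%}: R_eff = {effective_r(coverage):.2f}")
```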

    When reporting any health issue, an awareness of its emotiveness is important, and he therefore uses certain “filters” – only reporting on “cures” that have been published in the better class of peer-reviewed journal and are the product of decent clinical trials, rather than in-vitro or animal testing (although he admits that on the latter he has an element of bias as a vegetarian). Similarly, he will include appropriate caveats; as mentioned, the Wakefield study/12 people thing. This is part of the broadsheet thing – the Mail, for example, “cures cancer every week”.

    A philosophy graduate, but he worked for a long time on various nursing trade journals – Nursing Standard, Nursing Times – which has given him both an excellent contacts book and some internal “buzzers and alarms” for when a story sounds implausible or untrustworthy. However, he feels it would be arrogant of any journalist to assume they know what they are talking about; the contacts book is the first port of call. The Royal Colleges tend to be excellent, and the British Psychological Society has a very good database of research interests.

    He raised the issue that the Lancet and other (quality) peer-reviewed journals have a duty to vet what they publish, as their work is a lead point for mainstream journalism. The Wakefield paper in the Lancet was later admitted to have had shoddy methodology, and Wakefield himself may have had a conflict of interest; neither was spotted in the original Lancet review. This bypasses journalists’ filters.

    He doesn’t use press releases as such; if, say, a drug manufacturer sends a press release, he will ignore it, as there isn’t time or space to pay attention to all of them. However, when quality journals publish something, he will first read the news-wire release and then get the abstract and perhaps the basic methodology off PubMed so that he can determine its reliability (asking questions such as the size of the trial, the type of trial, and the institution [South Wales Poly? Barts and the London?]).


    Questions so far

    October 4, 2006

    1.  What requirements should be placed on medical/scientific journalists to be educated in their field? Compare British to US scientific journalism.

    2. How should these – and other – requirements be put in place: voluntary regulation, or legislation?

    3. Should ethical training be required for journalists?

    4. Should regulation be increased in other areas – e.g. declaration of interests, objectivity/fairness, separation of news and comment? Again, should regulation be self-imposed or legislated? What evidence is there of the effectiveness or otherwise of self-imposed (PCC) regulation, and what are the risks of legislation – might it draw the teeth of the media?

    5. What medical/scientific knowledge and understanding is it reasonable for a journalist to assume of his/her readership? Where is the line between helpful exposition and condescension? How much should it change between publications? Is it true that The Sun is deliberately written for a reading age of nine, and if so should it assume a similar level of expertise in its readers?

    6. Do journalists have a duty to their employers to sell papers? To what extent could sensationalism be justified by that? Or, rather – since it sounds tautological to say that “sensationalism” is a bad thing – when does justifiable volubility become sensationalism? Presumably if there is such a thing as a duty to their employers, then it will require “upselling” stories somewhat or the duty becomes meaningless.

    7. Perhaps the chief question: what is the line between reporting in the public interest and scaremongering?

    8. When should the scientific consensus view be taken at face value, and when should it be questioned? Scientifically untrained journalists should be wary of seeking out controversial views for “balance”, but I believe there are cases when the “establishment” view has turned out to be dangerously wrong – BSE? Asbestos? Were these views supported by the scientific community? A decent-sized section on Popperian scientific philosophy – what is certainty? Can it be achieved? Falsifiability, inductive reasoning, etc – would be interesting and informative.

     9. What responsibilities do scientists and doctors have in reporting their research to the media? Ben Goldacre, the Newton’s Apple Thinktank launch essays, badscience.net, 16th Oct 2006:

    Scientists and doctors, for example, can take care to be clear about the status and significance of their work when talking to journalists. Are the results preliminary? Have they been replicated? Have they been published? Do they differ from previous studies? Can you generalise, say, from your sample population to the general population, or from your animal model to humans? Are there other valid interpretations of your results? Have you been clear on what the data actually show, as opposed to your own speculation and interpretation? And so on.

    It is naive to imagine that such basic guidelines will be heeded by the irresponsible characters on the fringes who produce so much media coverage. However, they do represent best practice, and so they are always worth reiterating: they deserve to be incorporated into codes of practice from professional bodies and research funding bodies.

    Scientists and doctors would also be well advised to take some even simpler steps: to think through the possible implications of their work, inform interested parties before publication, and seek advice from colleagues and press officers. This advice and more is all covered in the Royal Society’s excellent Guidelines on Science and Health Communication, published in 2001 [3].

    Journals, too, can take a lead, since they often produce the promotional material for research. Risk communication is a key area here, and although it is tempting to present risk increases, and indeed benefits, using the largest single number available (the “relative risk increase”) it is also useful to give the “natural frequency”. This figure has context built-in and is more intuitively understandable: it is the difference between ibuprofen causing “a 24 per cent increase in heart attacks” (the relative risk increase) and “one extra heart attack in every 1,005 people taking it”.
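    Goldacre’s relative-risk versus natural-frequency distinction is easy to show with arithmetic. A minimal sketch: the baseline heart-attack rate below is an assumption, back-derived so the output matches the quoted ibuprofen example, not a figure taken from the underlying study.

```python
# Sketch: converting a relative risk increase into a "natural frequency".
# The baseline rate is an assumption chosen so the result matches the
# ibuprofen example quoted above; it is not taken from the study itself.
RELATIVE_RISK_INCREASE = 0.24          # "a 24 per cent increase"
BASELINE_RATE = 1 / (1005 * 0.24)      # assumed background heart-attack rate

extra_cases_per_person = BASELINE_RATE * RELATIVE_RISK_INCREASE
people_per_extra_case = 1 / extra_cases_per_person

print(f"one extra heart attack in every {people_per_extra_case:.0f} people")
# → one extra heart attack in every 1005 people
```

The same 24 per cent reads very differently depending on the baseline: against a rare event it may mean almost nothing in absolute terms, which is exactly why the natural frequency has “context built-in”.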

    10. Does freely available, peer-reviewed scientific literature – which should in theory remove concerns about conflicts of interest on the part of the writers – confuse journalists more used to hunting out hidden agendas? A common theme appears to be “this researcher has in the past received funding from Drug Company A; therefore we should distrust his research purporting to show the efficacy of Drug Company A’s new product”. Is this unfair or is there reason to doubt it? Goldacre: http://www.badscience.net/?p=251 “…over the past few years there have been numerous systematic reviews showing that studies funded by the pharmaceutical industry are several times more likely to show favourable results than studies funded by independent sources”. Sinister?

    11. Further to the drug company thing – how widespread is the practice of using charities and friendly media outlets to sidestep advertising standards rules? Who has what responsibility where in that scenario?

    12. How should I decide which articles to use? Presumably won’t be able to use all of them. Is there a way of randomly selecting them? Should I deliberately choose the most distorted pieces or am I then guilty of distortion myself? Anyway. Will need to establish a selection process. Could simply be “the ones that interest me”, I suppose – they don’t have to be representative of a paper’s editorial stance, since the existence of an unsound piece is A Bad Thing all on its own. Hmm. Seek advice.

    13. From Dad: [There are things like] glue sniffing which by consensus just don’t get a mention in the press in an attempt to prevent young people giving it a try. This made us wonder if you should have a chapter on examples of formal and informal agreements like this and their effectiveness.

    P-value/error bar – let’s face it, I will need an overall scientific-skills refresher

     Interviewing techniques

    Archive stuff – particularly news archives (Lexis Nexis? Someone who knows what that is would be handy to know)