Software Carpentry

Helping scientists make better software since 1997

Sign Error => Five Papers Retracted

Via Genome Biology (free registration for trial access required), news that scientists from the Scripps Institute have had to retract five papers published in various prestigious journals because of a sign error in a computer program. As Gregory Petsko says in the article:

Their mistake has consequences beyond the damage to the unfortunate young investigator and his team. For five years, other labs have been interpreting their biophysical and biochemical data in terms of the wrong structures. A number of scientists have been unable to publish their results because they seemed to contradict the published X-ray structures. I personally know of at least one investigator whose grant application was turned down for funding because his biochemical data did not agree with the structures. One could argue that an entire sub-field has been held back for years…

If I were a twenty-something working toward my PhD, I’d be thinking very hard about how I was going to validate the programs I was writing: the odds are growing steadily that journal editors and granting agencies are going to start demanding some sort of due diligence, sooner rather than later.
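To make that concrete, here is a minimal sketch (an invented Python example, not code from the retracted papers) of the kind of unit test that pins a sign convention down before it can quietly corrupt years of results:

    # Hypothetical data-processing step: convert between coordinate
    # handedness conventions by negating one component. The function
    # and the values below are invented for illustration.
    def flip_handedness(x, y, z):
        """Convert a right-handed coordinate to the left-handed
        convention used downstream by negating the y component."""
        return (x, -y, z)

    def test_flip_handedness():
        # One known input/output pair is enough to pin down the sign.
        assert flip_handedness(1.0, 2.0, 3.0) == (1.0, -2.0, 3.0)

    if __name__ == "__main__":
        test_flip_handedness()
        print("sign convention test passed")

A handful of checks like this, run automatically on every change, would flag an inverted sign the day it was introduced rather than five years later.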


Written by Greg Wilson

2007/03/19 at 14:46

Posted in Opinion

5 Responses


  1. I’m not fully doing TDD for the analysis code I’m writing in my PhD program, but I’m doing a fair amount of automated unit testing. I’m not sure that software development in the sciences is totally up to “modern” development practices. For example, I’m the only one in my research group who uses unit tests and source control. I’m also one of two people in our group (of about 10) who are *not* using Fortran (not that there is necessarily anything wrong with it, I just don’t care for the language).

    Brian

    2007/03/19 at 18:57

  2. Not having had to publish in journals myself, I would have thought that part of the purpose of these journals is that the experiments and results were checked before publication. Obviously this was not the case. This naturally leads into a discussion of the fate of journals of this nature when information can be disseminated so cheaply on the web. Oh, and a surprising amount of software written by software professionals is not totally up to “modern” development practices either.

    adam

    2007/03/19 at 23:20

  3. It’s fairly hard for journals to verify computational experiments, given that source code is often jealously guarded, rather than published, in the scientific world. I guess this is mainly due to the cut-throat grant-application process, and the constant fear that other teams may publish work based on your efforts before you do (again, publications == money). It reminds me of the old anecdote: “Why are academic fights so ferocious?” … “Because the stakes are so low.”

    Stefan

    2007/03/24 at 20:11

  4. […] Once again, there’s no mention of making sure the programs actually work—nothing about testing, nothing about tracking results so that when a bug does appear you know what you should retract, nothing. I’m sure the organizers would say, “Oh, that’s part of accuracy,” but I’ve been part of enough discussions to know that when numerical scientists say “accuracy and robustness”, they’re talking about algorithms, not about coding bugs. Given stories like this one, it’s a revealing oversight. […]

  5. […] Five months later, scientists from the Scripps Institute had to retract five papers published in various prestigious journals because of a sign error in a computer program. Stories like these are making the course easier to sell… […]


Comments are closed.
