Software Carpentry

Helping scientists make better software since 1997

Archive for March 2008

Meet the New Flaw

I was pretty excited when I heard that Microsoft was getting into scientific computing. As the world’s biggest desktop software company, I figured they might understand that scientific computing and high-performance computing are not automatically the same thing, and that reliability and reproducibility are more important than peak performance. Turns out I was wrong. The workshop I attended last September was dominated by discussion of topics like GPU programming and computational grids that are still bleeding-edge computer science, rather than the nuts and bolts that would actually help most scientists be productive day-to-day, and Microsoft’s new HPC++ Computational Finance lab’s site has a lot to say about speed but nothing about correctness.

So where should they be spending their time? If I ran the world, they’d start by reading Buckheit and Donoho on reproducible research, double back to Jon Claerbout’s notes on the same, check out the Madagascar project, and then try to figure out how to scale up those ideas to hundreds of thousands of scientists and publications in as diverse a range of fields as possible. It won’t give the senator something to stand beside on opening day, but it’ll do science a lot more good.


Written by Greg Wilson

2008/03/31 at 14:39

Posted in Opinion

Nice Quote

An article about computational science in a scientific publication is not the scholarship itself, it is merely advertising of the scholarship. The actual scholarship is the complete software development environment and the complete set of instructions which generated the figures.

David L. Donoho, WaveLab and Reproducible Research, 1995, p. 5.

(via Andrew Lumsdaine)

Written by Greg Wilson

2008/03/26 at 08:37

Posted in Community, Content

Survey: Silent Errors in Scientific Code

Posted on behalf of Daniel Hook and Diane Kelly:

We are members of a software research group from Queen’s University who are investigating ideas and tools to assist with the development of scientific software. We are starting a project focused on finding silent or hidden errors in scientific code. (Silent errors are errors that don’t result in a crash, an error message, or another obvious indicator of a problem.) To create a catalogue of common silent errors, we would like to hear debugging “war stories” from computational scientists. Using these stories, we hope to provide improved code testing techniques specifically for scientists.
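As a hypothetical illustration (not one of the survey’s stories), here is the kind of silent error the researchers mean: the textbook one-pass variance formula E[x²] − (E[x])² suffers catastrophic cancellation when the mean is large relative to the spread, so it can return a wildly wrong — even negative — “variance” without any crash, exception, or warning.

```python
def naive_variance(xs):
    """One-pass 'textbook' variance: E[x^2] - (E[x])^2.

    Mathematically correct, but numerically treacherous: when the
    values are large and close together, the two terms being
    subtracted are nearly equal, and cancellation silently destroys
    the answer.
    """
    n = len(xs)
    s = 0.0
    s2 = 0.0
    for x in xs:
        s += x
        s2 += x * x
    return s2 / n - (s / n) ** 2

def two_pass_variance(xs):
    """Two-pass variance: compute the mean first, then average the
    squared deviations. Much better behaved for this kind of data."""
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / n

data = [1e9 + 4.0, 1e9 + 7.0, 1e9 + 13.0, 1e9 + 16.0]
print(naive_variance(data))     # negative "variance" -- impossible, but no error is raised
print(two_pass_variance(data))  # 22.5, the correct answer
```

Nothing here crashes or prints a warning; only a scientist who knows a variance cannot be negative (or who compares against an independent computation) would notice that the first result is garbage — which is exactly why such errors are hard to catalogue without war stories like the ones being solicited.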

To conduct our study, we are looking for scientific software debugging stories: just a few lines explaining what the problem was and how you managed to solve it. You can contribute to our study by sending us a story and/or by passing this email along to colleagues who might be able to help with a story.

If you are interested in contributing an error story to this study, please get in touch with us for more details.

Thank you for your interest.

Daniel Hook and Diane Kelly

Written by Greg Wilson

2008/03/07 at 21:32

Posted in Community, Research

LearnHub Launches with Software Carpentry Front and Center

Toronto-area startup LearnHub has launched its new e-learning site with part of the Software Carpentry course as its featured offering. I’m looking forward to meeting a whole new batch of students… 😉

Written by Greg Wilson

2008/03/06 at 06:24

Posted in LearnHub