Software Carpentry

Helping scientists make better software since 1997

Archive for November 2008

Igor, Connect the Electrodes!

The Software Carpentry course site is still getting a fair bit of traffic, although readership is definitely tailing off:

[Chart: SWC site stats]

I’m hoping to run an intensive three-week version of the course in June 2009 in Toronto (details to follow); hope I can find time between now and then to finish wikifying the course notes, get the MATLAB material online, and generally freshen the site up.

Written by Greg Wilson

2008/11/30 at 15:27

Posted in Lectures, Version 3

SECSE’09 Call for Papers

Second International Workshop on Software Engineering for Computational Science and Engineering

Saturday, May 23, 2009

Co-located with ICSE 2009 – Vancouver, Canada

http://www.cs.ua.edu/~SECSE09

Overview

This workshop is concerned with the development of:

  • Scientific software applications, where the focus is on directly solving scientific problems. These applications include, but are not limited to, large parallel models/simulations of the physical world (high performance computing systems).
  • Applications that support scientific endeavors. Such applications include, but are not limited to, systems for managing and/or manipulating large amounts of data.

A particular software application might fit into both categories (for example, a weather forecasting system might both run climatology models and produce visualisations of big data sets) or just one (for example, nuclear simulations fit into the first category and laboratory information management software into the second). For brevity, we refer to both categories under the umbrella title of “Computational Science and Engineering (CS&E)”.

Despite its importance in our everyday lives, CS&E has historically attracted little attention from the software engineering community. Indeed, the development of CS&E software differs significantly from the development of business information systems, from which many of the software engineering best practices, tools and techniques have been drawn. These differences include, for example:

  • CS&E projects are often exploring unknown science, making it difficult to determine a concrete set of requirements a priori.
  • For the same reason, a test oracle may not exist (for example, the physical data needed to validate a simulation may not exist). The lack of an oracle clearly poses challenges to the development of a testing strategy.
  • The software development process for CS&E application development may differ profoundly from traditional software engineering processes. For example, one scientific computing workflow, dubbed the “lone researcher”, involves a single scientist developing a system to test a hypothesis. Once the system runs correctly and returns its results, the scientist has no further need of the system. This approach contrasts with more typical software engineering lifecycle models, in which the useful life of the software is expected to begin, not end, after the first correct execution.
  • CS&E applications often require more computing resources than are available on a typical workstation. Existing solutions for providing more computational resources (e.g., clusters, supercomputers, grids) can be difficult to use, resulting in additional software engineering challenges.
  • CS&E developers may have no formal knowledge of software engineering tools and techniques, and may be developing software in a very isolated fashion. For example, it is common for a single scientist in a lab to take on the (formal or informal) role of software developer and to have to rely solely on web resources to acquire the relevant development knowledge.

Recent endeavors to bring the software engineering and CS&E communities together include two special issues of IEEE Software (July/August 2008 and January 2009) and this current ICSE workshop series. The 2008 workshop [http://www.ua.edu/~SECSE08] brought together computational scientists, software engineering researchers and software developers to explore issues such as:

  • Those characteristics of CS&E which distinguish it from general business software development;
  • The different contexts in which CS&E developments take place;
  • The quality goals of CS&E;
  • How the perceived chasm between the CS&E and software engineering communities might be bridged.

This 2009 workshop will build on the results of the previous workshop.

As in 2008, the 2009 workshop will combine presentation and discussion of the accepted position papers with significant time for continuing discussions from previous workshops and for general open discussion.

Submission Instructions

We encourage submission of position papers or statements of interest from members of the software engineering and CS&E communities. Position papers of at most eight pages are solicited to address issues including but not limited to:

  • Case studies of software development processes used in CS&E applications.
  • Measures of software development productivity appropriate to CS&E applications.
  • Lessons learned from the development of CS&E applications.
  • Software engineering metrics and tool support for CS&E applications.
  • The use of empirical studies to better understand the environment, tools, languages, and processes used in CS&E application development and how they might be improved.

The organizing committee hopes for participation from a broad range of stakeholders from across the software engineering, computational science/engineering, and grid computing communities. We especially encourage members of the CS&E application community to submit practical experience papers. Papers on related topics are also welcome. Please contact the organizers with any questions about the relevance of particular topics. Accepted position papers will appear in the ICSE workshop proceedings and in the IEEE Xplore Digital Library.

Please observe the following:

  1. Position papers should be at most 8 pages.
  2. Format your paper according to the ICSE 2009 paper guidelines.
  3. Submit your paper in PDF format to carver@cs.ua.edu.
  4. Deadline for submission: January 19, 2009.
  5. Submission notification: February 6, 2009.

Organizing Committee:

  • Jeffrey Carver, University of Alabama, USA (chair of the organizing committee)
  • Steve Easterbrook, University of Toronto, Canada
  • Tom Epperly, Lawrence Livermore National Laboratory, USA
  • Michael Heroux, Sandia National Laboratories, USA
  • Lorin Hochstein, USC-ISI, USA
  • Diane Kelly, Royal Military College of Canada
  • Chris Morris, Daresbury Laboratory, UK
  • Judith Segal, The Open University, UK
  • Greg Wilson, University of Toronto, Canada

Written by Greg Wilson

2008/11/21 at 08:06

Posted in Community, Research

Getting the Science Right—Or At Least, Less Wrong

Via The Great Beyond:

The US National Academy of Sciences has created an initiative that will link TV and movie directors with scientists and engineers to incorporate more accurate science content into entertainment.

Press release here, web site here. That would be a cool job…

Written by Greg Wilson

2008/11/20 at 17:03

Posted in Noticed

Science Lessons for MPs

Via Nature: politicians from the UK Conservative Party will be required to take science lessons. On the one hand, kind of sad that they didn’t learn the basics in grade school.  On the other hand, yay!, and when will Canadian parties require the same?

Written by Greg Wilson

2008/11/17 at 09:57

Posted in Noticed

What Sciences Are There?

Over 1900 people have already responded to our survey of how scientists use computers, and it still has two weeks left to run. Our next task will be to analyze the data we’ve collected, which (among other things) means coding people’s free-form descriptions of their specialties so that we can talk about physicists and chemists as opposed to “this one person who’s doing N-brane quantum foam approximations to multiversal steady-state thingummies”.
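For concreteness, here's the kind of thing I have in mind for that coding step, in Python. The keyword map, the category names, and the specialties.csv input file are all made-up placeholders, since choosing the real categories is exactly the problem:

    import csv

    # Hypothetical keyword -> discipline map; the real one would be far larger,
    # and the category list on the right is exactly what is still undecided.
    KEYWORDS = {
        "quantum":  "physics",
        "particle": "physics",
        "polymer":  "chemistry",
        "genome":   "biology",
        "protein":  "biology",
        "climate":  "earth science",
    }

    def code_specialty(description):
        """Map one free-form specialty description to a broad discipline."""
        text = description.lower()
        for keyword, discipline in KEYWORDS.items():
            if keyword in text:
                return discipline
        return "unclassified"   # set aside for manual review

    def tally(filename):
        """Count responses per discipline from a one-column CSV of descriptions."""
        counts = {}
        with open(filename) as handle:
            for row in csv.reader(handle):
                if row:
                    discipline = code_specialty(row[0])
                    counts[discipline] = counts.get(discipline, 0) + 1
        return counts

    if __name__ == "__main__":
        for discipline, count in sorted(tally("specialties.csv").items()):
            print("%s: %d" % (discipline, count))

Anything the keyword map misses gets flagged for a human to look at, which is where the choice of categories really starts to matter.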

Except: are “physics” and “chemistry” too broad?  At that level, there are only a handful of sciences: astronomy, geology, biology, mathematics, psychology, um, computing, er, Curly, Larry, and Moe.  Or maybe you’d distinguish “ecology” from “biology”.  Or “oceanography” from something else, or — you see the problem.  Rather than making up our own classification scheme, I’d like to adopt one that’s widely used and generally intelligible, but I’m having trouble finding one.  Yahoo!, Wikipedia, and other web sites have incompatible (and idiosyncratic) divisions; the Dewey Decimal System and other library schemes have a very 19th Century view of science, and the ACM/IEEE publication codes are domain-specific.

If anyone can point me at something else (ideally, something with about two dozen categories — that feels like it ought to be about right, just from eyeballing the data we have so far), I’d be grateful.

Written by Greg Wilson

2008/11/16 at 21:34

Posted in Community, Research

One Good Survey Deserves Another

While we’re running our survey of how scientists use computers, the folks at The MathWorks are asking their users a few questions too.  If you use any MathWorks products, and have a few minutes, they’d be grateful for your help.

Written by Greg Wilson

2008/11/04 at 08:16

1731

1731 people have completed our survey of how scientists use computers since it went online three weeks ago.  That’s pretty cool, but I’d like to double the number (at least).  If you consider yourself a working scientist, and haven’t taken the survey yet, please take a moment and do so.  If you aren’t a scientist, but know some, please pass on the link:

http://softwareresearch.ca/seg/SCS/scientific-computing-survey.html

Thanks!

Written by Greg Wilson

2008/11/02 at 12:58

Posted in Community, Content, Research
