Software and Social Defaults

There are a number of ways in which software, and the process of producing software, can be ethical. There are matters of adherence to software engineering standards. (Ranty aside: Writing software is not engineering. Nor is it architecture. It’s a craft, analogous to very structured prose-writing. The success of a project is determined far less by the external ‘engineering’ standards followed than by the skill and craftsmanship of the programmers. This will never be accepted by middle-managers of software projects, since that would force them to accept how little they’re actually in control.) There are matters of modularity — and therefore malleability; of testing rigour; of accessibility of the finished product.

I’m increasingly convinced that there’s an aspect of software ethics that’s being missed. It follows inevitably from the fact that so much commonly-used software these days serves a social function: e-mail, IM-ing, Usenet, blog management systems, even web-editing software. Social software implies a social ethics. I’m tempted to call it a social etiquette, but it’s far more than that, because the effects can be so significant. Unwittingly transmitting a virus, for example, isn’t a minor faux pas; it’s a kind of vandalism, a dereliction of social responsibility.

Who’s to blame for anti-social software use? Anyone who uses software anti-socially, of course, whether they understand the issues or not. There’s an argument to be made — I’m not quite sure I’d make it myself, but it’s there — that some minimum demonstrable amount of ‘net-awareness ought to be required to gain access to the ‘net.

That’s all well and draconian, but it’s possible to accept that final responsibility for the anti-social use (or neglect) of software belongs to its user, whilst also recognising the responsibility that belongs to the designer of social software to assist with its responsible use. Some very simple principles can be of assistance here:

  1. Users of software are lazy, and will use whatever they’re given, in the form that it’s given to them, unless they have a powerful incentive to change it.
  2. Users of software are often fearful of technology and will typically not investigate features of that software beyond those which they absolutely need.
  3. An exception to 2. is that, if some feature strikes them as ‘cool’, and is presented in a form which is easy to enable, they will typically enable it without regard for any anti-social effects it might have.

So much — frustratingly, depressingly so — of presenting social software to users in a responsible manner is simply a matter of choosing the right defaults. Because of laziness and fear, users will typically not change defaults unless they have a powerful incentive to. And yet defaults within social software are so very often anti-social:

  - e-mail clients which default to HTML (or multipart text/HTML) or rich-text formatting, with no real regard for how much unnecessary traffic they create, or how they interface with text-only clients;
  - clients with enabled-by-default scripting, which serves as a recipe for virus propagation;
  - simplistic web-generation tools which default to methods supported by only a preferred browser;
  - registration dialogues which default to less private settings;
  - operating systems which place the ability to interact with their mother organisation behind the scenes above the user’s security.

There are so many poorly-chosen defaults, because there are so many reasons to choose defaults poorly: a desire to present the user with easily-accessible cool features, no matter how anti-social they might be; a desire to leverage market domination to change convention, and thereby gain control; hunger for greater possession of a user’s system. None of these is good for the society formed by users of social software. They create problems of portability, of privacy, of security, of bandwidth, of wasted time and resources. They affect all of us.

It would make me very happy to see a software industry which policed itself with regard to social software issues, and particularly with regard to socially responsible defaults. I realise that’s not about to happen soon. A nice alternative would be a seal of approval that could be awarded by some independent body to social software which meets a set of criteria. With huge off-the-top-of-my-head caveats, this sort of thing:

  1. If there exists a non-software convention for doing something that the software does, carry that convention forward to software as the default.
  2. If there already exists a convention in other software for doing something that the software does, use that convention as the default.
  3. Default to the simplest, lowest-tech options.
  4. Default to the options which generate the smallest ‘net traffic.
  5. Default to the options which maximise portability of any output from the software.
  6. If there are security and/or privacy issues for the software, default to the most secure and/or private settings.
  7. If options are provided which violate any of the above, provide enough information so that users are told, at the time of choosing them, what anti-social effects they might have.
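Points 6 and 7 of the charter, at least, lend themselves to a concrete sketch. Here’s a minimal, hypothetical illustration (all names invented, not any real product’s API): each option ships with its most private or secure value as the default, and choosing a less social value surfaces, at the moment of choice, a description of its anti-social effects.

```python
class SocialOption:
    """One configurable option, with a socially-responsible default."""

    def __init__(self, name, default, warnings):
        self.name = name
        self.value = default      # point 6: ship the safest value as default
        self.warnings = warnings  # anti-social effects, keyed by option value

    def set(self, value):
        # Point 7: tell the user, at the time of choosing, what
        # anti-social effects this value might have.
        if value in self.warnings:
            print(f"Warning ({self.name}={value}): {self.warnings[value]}")
        self.value = value


# A hypothetical mail-client option, defaulting to the lowest-tech,
# most portable format (points 3-5):
compose_format = SocialOption(
    name="compose_format",
    default="plain_text",
    warnings={
        "html": "Generates larger messages and may be unreadable "
                "in text-only clients.",
    },
)

# The user may still choose the anti-social value, but is warned first:
compose_format.set("html")
```

The point of the sketch isn’t the mechanism — a real client would use a dialogue rather than a print statement — but the shape of the policy: the safe value costs the user nothing, and the unsafe value costs them an informed decision.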

Call it the Charter for Socially-Responsible Software Defaults. Identifying which software violates it, and how, is left as an exercise for the reader. You won’t need to look very hard.
