Software is biased, so how can developers act in the best interest of their audience?

On Monday, The Nominet Trust published The Personal (Computer) Is Political, a provocation paper based on my last three years' research into the philosophies and agendas built into the software and web services we use every day. The report calls for consumers and creators to recognise that software is a cultural artefact – like film, television, architecture, comedy, food, art and design – and therefore part of the zeitgeist of its day. This includes the political, economic and social climate of where and when it was built.

As part of the report, I propose several sets of recommendations. Today, I’m highlighting the Recommendations For Developers.

Developers are the magicians who create the systems that help us navigate the web. As observer and critic Jaron Lanier explains, developers rarely plan for world domination; rather, many of their decisions are short-term solutions to immediate problems, and those solutions become embedded in the system almost accidentally (Lanier, 2010).
  • At all stages of the design and implementation process, developers must consider the implications of a design change not against the digital identity that is limited by the capabilities of their service, but against the entire person that the digital identity represents. A line in a database is a person, not just a line in a database. What appears to be a username or a tiny datapoint on their service actually represents a real person in all their social and psychological complexity, so a minor change to someone's digital identity may have much greater real-world implications, extending beyond the scope of the application. (A sketch of this idea follows these recommendations.)

  • The loss of empathy through abstraction also scales from individuals to groups. Relying on Big Data or data-driven design amplifies the hegemony of the average: no amount of aggregate data will yield a design solution that renders the first recommendation unnecessary. The humanness is found in the edge cases. (See the second sketch below.)

  • Be aware of the language that you use to describe a social action on your service, ensuring that the user does not confuse that action with another of the same name that carries different social implications in another context. 'Friending' and 'Liking', for example, carry different meanings and social implications online than they do offline. Users should be educated as to precisely what you mean by your choice of language.

  • Designers must be aware of assumed digital literacy. This isn't just about knowing how to use a mouse, a keyboard, a touchscreen or the web: it includes knowing what is and isn't public, how long something is there for, who can reply to it, how easily the record of an interaction can be deleted, how easily something can be attributed to a real-world identity, and who owns the results of any interaction. This goes wider than privacy alone; it is an understanding of social boundaries. (See the third sketch below.)

  • Human suffering outweighs programming effort. It’s the duty of the software to become complex, not the duty of the human to become simple.
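
To make the first recommendation concrete, here is a minimal sketch of the mindset in Python. Every name in it (User, deactivate_account) is hypothetical; it illustrates the questions to ask before changing a record, not a real system.

```python
# A minimal sketch, not an implementation: every name here
# (User, deactivate_account) is hypothetical.
from dataclasses import dataclass

@dataclass
class User:
    id: int
    username: str
    # The row ends here; the person it represents does not. This
    # record is referenced by messages, photos and other people's
    # histories, none of which appear in this table.

def deactivate_account(user: User) -> None:
    """Deactivation is not just flipping a flag on one row.

    Before touching the record, enumerate the real-world effects:
    do other users lose shared conversations? Does content the
    person is tagged in become orphaned? Can they recover the
    account, and for how long?
    """
    print(f"Deactivating {user.username}: consider the person, not the row")

deactivate_account(User(id=1, username="alice"))
```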
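
The second recommendation fits in a few lines. The numbers below are invented, but the shape of the problem is real: the mean describes nobody.

```python
# A minimal sketch with invented numbers: how designing to the
# average erases the edge cases.
from statistics import mean

# Seconds each user took to complete a hypothetical sign-up flow.
durations = [30, 32, 29, 31, 33, 30, 28, 31, 600, 540]

print(mean(durations))         # 138.4: a figure that describes no actual user
print(sorted(durations)[-2:])  # [540, 600]: the users the design failed

# A flow tuned to the mean serves a person who does not exist; the
# two outliers (perhaps on assistive technology, perhaps on a slow
# connection) are where the humanness, and the design work, lives.
```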
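
And a sketch of the fourth recommendation: if the literacy a service assumes were written down as data, the interface could surface those social boundaries instead of presuming the user already knows them. The SocialBoundaries type and its fields are hypothetical.

```python
# A minimal sketch: the literacy the service assumes, written down
# as data the interface can surface. All names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class SocialBoundaries:
    is_public: bool                # what is and isn't public
    retention_days: Optional[int]  # how long it is there for (None = forever)
    who_can_reply: str             # e.g. "anyone", "followers", "nobody"
    user_can_delete: bool          # how easily the record can be removed
    attributable: bool             # can it be tied to a real-world identity?
    owner: str                     # who owns the result of the interaction

# A public post that never expires, tied to a real name and owned
# by the service: boundaries a user may not realise they accepted.
post = SocialBoundaries(
    is_public=True,
    retention_days=None,
    who_can_reply="anyone",
    user_can_delete=False,
    attributable=True,
    owner="service",
)
print(post)
```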

Read the whole report here.