At the ISEA conference in Istanbul, Turkey in September, I sat on a panel with the team behind SerenA, a cross-disciplinary, multi-institutional research group currently working on an EPSRC-funded sandbox project to build a technology that produces serendipity.
I am skeptical of computer solutions that purport to generate serendipitous outcomes, as I discuss at length on The Serendipity Engine tumblog (the home of my research with Kat Jungnickel, which recently received a small grant from the BSA). The good news is that they are too.
Here’s the talk I gave at the panel:
Beautiful Machines: A sense-making methodology for serendipitous discovery
There has been extensive research exploring how to build digital technologies that can mimic human processes like learning and logic, but this paper takes as its starting point that the phenomenological experience of being human is not replicable in the digital environment. Simply put, unlike the capacity to learn – which can be measured – social and psychological concepts like serendipity are too multifaceted to be programmed in a way that scales to the current capacity of computer processors. Services that claim to generate serendipitous outcomes may produce something that mimics coincidence, but the attribution of serendipity is inherently too complex for current digital technologies to discern.
defining the problem
The problem is one of definition, and of how to represent it adequately: such phenomena are socially, culturally, temporally and materially defined, so databases that seek to reduce them by collecting data points and applying heuristic algorithms would expand exponentially if they tried to document as much as is necessary to replicate the serendipitous effect accurately. Common constituents shared between people do help to locate an individual within certain parameters – country, demography, religious or political orientation, IQ, etc. – but the proportions of the components are unique to each person. Measuring being human thus becomes a philosophical problem rather than a technological one.
As we have discovered in the previous talks on this panel, serendipity is difficult to define because it relies upon the individual’s sense-making of a “happy accident” or the unexpected confluence of worlds (trickster). The current crop of “digital serendipity solutions” produce an avatar of “serendipity” based on discovery mechanism heuristics that do not account for the various environmental and individual elements that contribute to that attribution. Certainly the results might “pass” as serendipity, and this may be the desired effect for entities who wish to invest in this area, but their solutions are neither serendipitous nor scientific because the inputs they cater for presuppose a self-fulfilling – and non-accidental – outcome.
Instead, a complex concept like serendipity can be more holistically represented and understood in a way that allows for what Law (2004) describes as “messiness,” or the array of elements that are difficult to reduce or to conventionally order or organise (Jungnickel, 2010).
There are methods that seek to capture the messiness of human experience, particularly in the social sciences. Qualitative analytic techniques in the psychological sciences, like Grounded Theory and Interpretative Phenomenological Analysis, seek to understand the experience of the individual, while ethnographic work, primarily used in sociology and anthropology, explores the sense-making processes employed by groups by examining texts and objects produced by participants in social systems. Yet even these methods fall short of being able to convey the outcomes, particularly because of the agendas and ideologies imbued in the written word (Sellen & Harper, 2002; Krotoski, in press), as the resulting sense-making by the researcher(s) is commonly reduced to static, text-based interpretations.
how to “make sense”
An alternative method, practiced by Jungnickel and McHardy (2010), explores dynamic systems of sense-making by producing physical interpretations of data and theory, encouraging an ongoing, collaborative and reflexive understanding of the aspects of experience we are unable to measure or control. They are creating a series of “Enquiry Machines” that are an assemblage of human, non-human and conceptual components aimed at “drawing attention to the material nature of enquiry… an attempt to critically examine practice… a metaphor in action.”
The functions of performative methods that “make real” are manifold: they serve the researchers’ sense-making and demonstrate to viewers and users the conceptual complexity involved in the social phenomenon under consideration. The processes Jungnickel documents in her ethnographic performative practices, including the development of Enquiry Machine No 1, “open up alternate means of interrogating [her] ideas.” She describes the reflexivity involved for the researcher in the choice of materials, the spatial configurations and the locations of display, and the “messy, dynamic and collaboratively produced” evolution of the meaning of the research objects (including text, video, imagery and physical objects) as viewers make “informal and serendipitous connections” and transform the display.
making sense for them
Individuals’ sense-making of content is also represented in other machines built with research and artistic agendas. The MONIAC was an analogue computer devised and built by a sociology graduate with an interest in economic principles as a demonstration tool for classrooms, and it went on to become a predictive tool for governments and commercial organisations.
Created in 1949 by William Phillips in his spare time, the MONIAC is approximately 2 metres high, 1 metre wide and 1 metre deep. It consists of a series of found transparent plastic tanks and pipes affixed to a wooden board; different-coloured water flowed between the tanks to represent economic principles and the setting of economic parameters like tax and investment rates. It made visible the hidden inter-relationships between variables. Phillips’ computer was his method for translating the language of economics into hydraulics, “from a language he didn’t understand into a language which he did.”
The computer succeeded in its objective: “It was such a supreme visual telling of the mechanics of the Keynesian idea that I think all students felt for the first time they started to understand what the basic ideas were all about.” – Dr Brian Henry, Director of the Centre for International Macroeconomics, Oxford University
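The hydraulic logic Phillips built – stocks of water circulating between tanks, with rates standing in for policy settings – can be sketched as a toy discrete-time stock-flow loop. The tank names, rates and the `simulate` function below are illustrative assumptions for this sketch, not Phillips’ actual calibration:

```python
# A toy discrete-time stock-flow loop in the spirit of the MONIAC.
# "Water" (income) circulates between tanks; the rates play the role
# of valve settings. All names and numbers here are illustrative.

def simulate(tax_rate=0.2, savings_rate=0.1, gov_spending=30.0,
             investment=10.0, steps=200):
    """Iterate the circular flow of income until it settles."""
    income = 100.0
    for _ in range(steps):
        taxes = tax_rate * income             # drains to the government tank
        disposable = income - taxes
        savings = savings_rate * disposable   # drains to the savings tank
        consumption = disposable - savings    # flows back to firms
        # next period's income: spending that returns to the main tank
        income = consumption + gov_spending + investment
    return income

# Opening the tax "valve" wider drains more from each cycle,
# so the system settles at a lower equilibrium level of income.
low_tax = simulate(tax_rate=0.2)
high_tax = simulate(tax_rate=0.3)
```

Changing a single rate and watching the whole system settle at a new level is, in miniature, the kind of inter-relationship between variables that the physical machine made visible.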
making sense for her
Artist Karen Green’s series of prints of fictional machines represents another kind of sense-making, different from Phillips’, Jungnickel’s and McHardy’s: their objects explained “objective” principles by seeking to unpick them according to theory, while Green’s machines represent a subjective sense-making, intended as a personal working-through of a concept. The Forgiveness Machine was developed in response to her husband’s suicide and the need to process that event. “The forgiveness machine was seven-feet long,” she says, “with lots of weird plastic bits and pieces. Heavy as hell.” The idea was that you wrote down the thing that you wanted to forgive, or to be forgiven for, and a vacuum sucked your piece of paper in one end. At the other it was shredded, and hey presto. When physically re-created and put on display in an art gallery, the reaction had qualities similar to those of the machines previously described, instilling a sense of connection with the principles on display (emotional, conceptual).
The machine was overwhelmed, too; it couldn’t process all the requests and was eventually dismantled. “Forgiving is never as easy as we would like,” she says. “Apparently quite a lot of people cried.”
- The Guardian, 11 April 2011
making sense of serendipity
The Serendipity Engine, a collaboration with Jungnickel, is the second in the Enquiry Machine series, and similarly seeks to render visible the multiple socio-material-technical (and I would add, psychological) processes that are invested in the attribution of serendipity. It is an “engine” rather than a machine because, although it has a form and a function with a purpose – to render visible the cultural, temporal and material processes in generating a serendipitous outcome – it will transform the essential properties of the input, irrevocably converting the ins into brand new outs that serve the function, purpose and performance of the machine.
It will change the nature of chance by inspiring insight in order that value is attributed to the experience.
The “engine” will have three forms:
1) Jungnickel and I will develop a physical “core” that is based upon the current body of research that seeks to define the socio-cultural, psychological, technological and material nature of serendipity. This will be a dynamic and interactive entity made from scrounged materials, curated to represent “mess”. Our interpretation will focus on the “insight” element of the definition evolved by the SerenA project and other research in this area.
This will represent the social and psychological discourse upon which interpretations of value will subsequently be added.
2) To emphasise the locativity of the definition, we will seek “apps” from practitioners in different countries that can be appended to the core engine. These will demonstrate to users that the interpretations of serendipity are locally oriented.
3) There will be a series of open salon events in which commissioned artists, researchers and others will be asked to create a performative piece or lecture for a public audience that is based on their interpretation of the theme “serendipity”. The objective is to “generate” serendipitous encounters between attendees across four dimensions: inspired by the interaction between audience member and object, between audience member and presenter, audience members inspired by object, and audience members not inspired by object.
This form is closest to the solutions designed by engineers, subject to the curatorial filtering and considered placement of “inputs” (commissioned presenters) in order to produce output.
As a technology, the Serendipity Engine seeks to insert messiness into the process of defining a value-laden human phenomenon. Through the process of forming and re-forming this analogue “solution” for a problem that is increasingly gaining value amongst commercial software developers, the aim is to render visible other factors that could produce more inclusive digital technologies that better represent being human in code.
Jungnickel, K. (2010) Exhibiting Ethnographic Knowledge: Making sociology about makers of technology. Street Signs; Centre for Urban and Community Research, Spring, London: Goldsmiths, 28-31.
Jungnickel, K. & McHardy, J. (2010) Enquiry Machine #1: Performing (im)possible futures. European Association for the Study of Science & Technology conference (EASST), University of Trento, Italy, Sept 2.
Krotoski, A. (in press). Wikileaks and the New, Transparent World Order. The Political Quarterly.
Law, J. (2004). After Method: Mess in Social Science Research. London: Routledge.
Sellen, A. & Harper, R. (2002). The Myth of the Paperless Office. Cambridge, MA: MIT Press.