This investigation explores relations between 1) a theory of human cognition, called Embodied Cognition, 2) the design of interactive systems and 3) the practice of ‘creative group meetings’ (of which the so-called ‘brainstorm’ is perhaps the best-known example). The investigation is one of Research-through-Design (Overbeeke et al., 2006). This means that, together with students and external stakeholders, I designed two interactive prototypes. Both systems mix physical and digital forms. Both are designed to be tools in creative meeting sessions, or brainstorms. The tools are meant to form a natural element in the physical meeting space. The function of these devices is to support the formation of shared insight: that is, the tools should support the process by which participants together, during the activity, get a better grip on the design challenge they are faced with. Over a series of iterations I reflected on the design process and outcome, and investigated how users interacted with the prototypes.
In creative meetings, participants do not always have a clear understanding of their creative challenge right from the start. Part of the problem is that each participant may understand the challenge, as it is initially introduced by the problem-owner, differently. In general, many creative challenges are complex, ill-defined problems to begin with (aka ‘wicked problems’). Especially when multiple stakeholders are involved, each with their own interpretation of what the challenge is ‘really’ all about, a better insight into ‘what the problem really is’ needs some time to evolve. In practice, shared insight into the creative challenge co-evolves with the team’s practical activities towards addressing the challenge. In current practice, people use all kinds of physical tools to develop this shared insight, such as: sticky-notes, whiteboard and markers, sketching paper, prototyping materials, photographs brought from home, and so on. The question is whether we can augment these physical tools using interactive technology in a meaningful way, without disrupting the creative flow, or the natural, improvised and flexible ways by which people currently interact with each other and their physical artifacts.
A theory that might be helpful for designing such integrated technological tools, as part of the creative space, is Embodied Cognition. This is a theory about the basic way in which people are able to think and act. It claims that what we call cognition is fundamentally dependent on our ongoing embodied activity. Bluntly speaking: no active body, no thoughts. The theory argues against the classic idea that thinking is something that happens purely ‘internal’ to us. It rejects the idea that the mind is the ‘software’ running on the ‘hardware’ of the brain. Thinking, instead, emerges in action, out of continuous embodied interactions between the brain, the body, and the way the body is ‘situated’ in a physical and social context. According to Embodied Cognition, cognition is best seen as a dynamic coupling (Clark, 1997; Dourish, 2001), or a process of coordination (Suchman, 2007; Clancey, 1997), or, as phenomenologists call it, as getting ‘grip’, through skilled action (Dreyfus, 2002; Merleau-Ponty, 1963). Artifacts, such as the sticky-note in a brainstorm session, play an important part in the embodied cognitive process. As Kirsh (2010) states, physical artifacts are ‘things to think with’. They help people to ‘offload’ thinking to the environment, to coordinate their own activities with those of others, and to create and hold active online couplings in the continuous feedback loop between action and perception.
One reason Embodied Cognition may be useful is that the field of interactive systems design shows a growing trend towards integrating physical form and digital process. Looking at it from the perspective of Industrial Design, this entails adding interactive behaviors to physical products, using sensors, actuators, and the like. From the perspective of computer science, it means creating so-called ‘tangible’ interfaces, where various physical objects can be used to control digital information, a follow-up on the familiar ‘graphical’ interface. A related trend is that of ‘contextual’ interfaces that depend crucially on available cues in the local environment. These developments go under such headings as ubiquitous computing, tangible interaction, wearable computing, augmented reality, and so on. The bulk of this design work has yet to find its way to the commercial markets, but it is expected to do so in the near future. The current popularity of interacting via mobile devices, such as the smartphone or tablet, with apps making use of GPS location, the accelerometer, and so on, signals a development that moves away from the classical ‘desktop interface’. One may say it moves interaction with digital processes ‘back into the real world’, mixing it seamlessly with physical objects, environments, and social contexts.
Through designing and researching two concrete interactive prototypes I explored the following research question:
RQ 1. How may we design interactive systems in support of embodied cognition?
The main objective of my investigation has been to reframe our conceptualization of interactive systems design, such that designers may start to think in new ways about what it is they are trying to do, based on Embodied Cognition theory. At the same time, the practical attempt to apply the theory in design also brings insight into the theory itself, which defines my second research question:
RQ 2. How does (the practical attempt at) designing interactive systems supporting shared insight in creative meetings, inform the theory of embodied cognition?
In a series of design iterations I undertook the following activities: 1) detailed observational studies of naturally occurring human practices (either with or without our prototypes); 2) participatory workshops involving potential users from several creative companies and organizations, executed at the site-of-practice, including situated interviews and ‘acting-out’ design concepts; 3) design explorations and prototyping, and reflecting on these in reference to the theoretical framework; 4) detailed observations of human activity in response to an experimental manipulation, using two variations on a prototype as conditions, and 5) general theoretical reflections. Together, these research activities enabled me to answer my research questions.
NOOT, in its final form, consists of a system of tangible clips with which one can create time-markings in a continuous audio-recording of the creative session. The tangible clips can be placed anywhere in the physical space, e.g. on the wall, on the table, on sketches or annotations, or on a mock-up or prototype. With a physical horn one can activate audio-playback that allows one to listen to the part of the conversation that was going on when the clip was first activated. In this way NOOT provides small segments of ‘history’ of the conversation, attached to meaningful physical items in the space, which can be fed back into the current conversation. In the final reflection I offered that NOOT couples individual moments of reflection-in-action to the overall group conversation, thereby supporting the formation of shared insight.
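NOOT's core mechanism can be summarized in a few lines of code. The sketch below is purely illustrative: the class and method names, and the 30-second playback window, are my assumptions for exposition, not taken from the actual prototype.

```python
# Illustrative sketch of NOOT's marker logic (names and the 30-second
# window are assumptions, not the real implementation).

class NootSession:
    """A continuous audio recording annotated by tangible clip markers."""

    def __init__(self):
        # Each clip, when first activated, stores its position in the
        # recording (seconds from the start of the session).
        self.markers = {}

    def activate_clip(self, clip_id, now_seconds):
        # Placing a clip marks the current moment in the conversation.
        self.markers[clip_id] = now_seconds

    def playback_window(self, clip_id, window=30.0):
        # The physical horn plays back the stretch of conversation that
        # was going on when the clip was first activated.
        t = self.markers[clip_id]
        return (max(0.0, t - window), t)


session = NootSession()
session.activate_clip("clip-1", 125.0)
print(session.playback_window("clip-1"))  # (95.0, 125.0)
```

The point of the sketch is the coupling: the digital state is nothing more than a mapping from physical clips to moments in the shared recording, so the ‘interface’ stays entirely in the physical space of the meeting.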
FLOOR-IT enables people to create digital photographs of any of the sketches or written texts (or other visual elements) created during the session. The series of personal snap-shots forms a ‘trace’, reflecting one’s evolving line-of-thought. Each person’s personal ‘trace’ is physically projected as a circle of digital images around the body, on the floor. On that floor, which is quite large (six projectors were used to create the canvas), small groups of people engage in a creative conversation, while their traces are publicly visible as projected around their bodies. The traces form a conversational ‘scaffold’, to which people can point and refer during the talk. Furthermore, by using foot gestures, images may be copied from one trace to another, and they may be combined to form new clusters, new traces, which stay fixed on the floor. The emerging, overall trace on the floor represents the growing ‘shared insight’ of the team as a whole, which continues to support the ongoing interactions of the creative team. In a user study, comparing FLOOR-IT with a variation that projected the pictures on a shared wall, it was discovered that such traces function to help people position themselves socially in relation to others. Referring to one’s personal trace during ongoing talk serves not so much to share factual information as to present oneself as a valuable partner in the activity, and to invite others to do the same.
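The trace bookkeeping behind FLOOR-IT can likewise be sketched as a simple data structure. Again, every name below (`Trace`, `snap`, `copy_to`, `merge`) is an assumption made for exposition, not the system's actual code.

```python
# Illustrative sketch of FLOOR-IT's traces (all names are assumptions).

class Trace:
    """An ordered series of snapshots projected around one person."""

    def __init__(self, owner):
        self.owner = owner
        self.images = []  # one's evolving line-of-thought

    def snap(self, image_id):
        # Photographing a sketch appends it to the personal trace.
        self.images.append(image_id)

    def copy_to(self, image_id, other):
        # A foot gesture copies an image from one trace to another.
        if image_id in self.images:
            other.images.append(image_id)


def merge(traces):
    # Combining traces forms a new shared cluster, fixed on the floor.
    merged = Trace(owner="shared")
    for t in traces:
        merged.images.extend(t.images)
    return merged


alice, bob = Trace("Alice"), Trace("Bob")
alice.snap("sketch-1")
alice.snap("sketch-2")
bob.snap("note-1")
alice.copy_to("sketch-2", bob)
shared = merge([alice, bob])
print(shared.images)  # ['sketch-1', 'sketch-2', 'note-1', 'sketch-2']
```

The design point the sketch makes concrete is that the ‘shared insight’ of the team is not stored anywhere as a separate object up front; it emerges by merging the publicly visible personal traces.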
Embodied Cognition is a broad field of inquiry, with roots in quite disparate research traditions. Based on my reflections on the design iterations, I was able to discern three variations of the theory that each have their own particular consequences for design. I call these 1) the distributed representation and computation perspective, 2) the socially situated practice perspective and 3) the sensorimotor & enactment perspective.
The distributed representation and computation perspective is perhaps most easily understood by those familiar with computational principles, and it has proven to be a useful and relevant set of principles for interaction designers. Yet it actually hinders interaction designers in getting to the heart of the notion of embodiment. Instead, based on my design investigations I offer that the prime ingredients needed for understanding how my prototypes support shared insight are 1) the sensorimotor aspect of cognition (how insight emerges from real-time coupling of perception and action) and 2) the social situatedness of cognition (how cognition is socially coordinated between people). Moreover, sensorimotor coupling and social situatedness are strongly integrated in one unified embodied activity (Goodwin, 2000). In particular, the studies revealed how people would create expressive traces in the environment. Expressive traces, e.g. a physical sticky-note, a NOOT clip, or a trace in FLOOR-IT, are both the outcome of people’s earlier actions, as well as guides for further action. That is, traces become part of people’s sensorimotor couplings. At the same time, expressive traces are also social artifacts, created in and for a social context, publicly available and socially accountable. They function to coordinate people’s social positioning in the physical space. Expressive traces form the linking pins between social interaction and sensorimotor coupling, thereby supporting the emergence of shared insight.
I offer a number of pitfalls and opportunities for designers who want to ground design in embodied cognition theory. I start with the claim that the classic interface concepts, which rely on information processing metaphors, are best explained with a ‘Distributed Representation and Computation’ version of embodied cognition. This also goes for many of the so-called ‘tangible media systems’, where tangible objects essentially ‘encode’ digital information in physical form. However useful, such concepts stand in the way of designing for a more fundamental form of Embodied Cognition, as they too easily draw us back into ‘Cartesian thought’.
Based on Socially Situated Practices and Sensorimotor Coupling and Enactment, I propose a more fundamental form of Embodied Cognition Design. Embodied Cognition Design brings forth interactive systems that transform our ways of perceiving, our possibilities for acting, and our ways of interacting socially with others, and it helps us to create enduring ‘expressive traces’ in the environment. In any concrete product proposal, all of these aspects will be part of the unified experience of the user. Through Embodied Cognition Design we may search for completely new roles for digital computing technology in human practices. This means going beyond the classical, Cartesian functions of storing, processing and presenting representational data. One consequence of this vision is that the ‘function’ of an artifact can no longer be predefined before one starts designing the ‘interface’: in Embodied Cognition Design, concrete interactions between the user and the system bring forth, or ‘enact’, the meaning that the system has for the user. This means one has to design the interactive behavior and ‘what the system is for’ both at the same time, with no a priori distinction between the digital and the physical aspect, nor even the embedding context. One may for example design in iterative fashion, building series of functioning prototypes, which can be tried out so as to stay in close contact with users and their context of practice throughout the entire project.
[Figure: TRACES 1 (aka FLOOR-IT)]
Publication list: http://www.jellevandijk.org/wp/publications/
Video impression of the designs: http://www.jellevandijk.org/wp/2011/11/05/concept-movies/
Workshop site Atlanta, nov ‘11: https://sites.google.com/site/cc11embodied/
(The web pages may show error messages, which can be ignored, and for which my sincere apologies; the software will be updated ASAP).