
Project Proposal

Updated: Nov 25, 2018

Mini-Abstract

We will explore the computational communication and interpretation of emotion: in particular, how computation can induce, or measure, an experience in an individual that differs from what we can traditionally convey, or understand, with our own physiology.




Introduction

Our research project will be centred on the emotional impact of audio and visual stimuli on individuals, and on ways in which this emotional engagement can be interpreted computationally. In doing so, we will explore the relationship between empathy and emotional response, with the assumption that higher levels of empathy with a particular subject matter will produce a stronger emotional response. We will present users with sets of stimuli - images from across the arts; sounds; photographs; videos - that we believe each embody particular emotions, and quantitatively measure each user’s own emotional responses to these.


We will use a suite of research methodologies (direct data measurement, user studies, theoretical reading and interviews with artists) to decide on a particularly interesting example of emotion, such as fear, hope or humour, and create a visualisation of the data we have collected in relation to that specific emotion. We will go on to forge theoretical links between strength of emotion and individual empathy to a subject, based on users’ personal experiences. We will finish by opining on how this theory around individual empathy fits in the wider context of society, for example in healthcare, policy making and community cohesion, and asking, ‘At what distance does empathy end?’


Research Background

Art is something that is created through the need for emotional expression, and also something that itself induces emotions: most would agree that these two features exist in symbiosis. Contemporary art theory is rich with exploration of emotional response, but interestingly, despite the universal and age-old presence of art, emotion as a phenomenon was not studied scientifically until the 1970s (with Daniel Berlyne as a pioneer), and not combined with art theory until even later (Silvia, 2005). In the present day though, as the remit of computational art has expanded, so has the theory examining human responses to this. At the forefront of such discussions is feminist technoscience, one of the features of which is the recognition that empathy and emotion are crucial and productive aspects of human experience, not by-products to be dismissed or minimised. This is a very brief summary of our understanding of the field, and we will now discuss our individual journeys towards this focus, as we think it interesting that this specific topic has proven the point of overlap for quite different background research:


Hazel: I came to this project from a concern for the social and societal impacts of both art and technology. Specifically, I was interested in how these areas of human endeavour both have the potential to challenge perceptions and contribute to progressive change in society, but can instead perpetuate and condone existing cultural norms. This depends on whom the art or technology is created by, how, and to what ends (Papadimitriou 2018). I began by thinking about randomness, errors and accidents in design: technological and socio-political progress made in a haphazard, unintentional, or poorly-understood way (Casey 1998). However, a huge component of this process of change is the emotional response elicited in the user of a technology or the audience of an artwork, which makes that work more likely to be considered important and accepted into the canon of progress. It therefore made sense to shift my focus to more personal human experiences of emotion in the context of art and computation, and to how these can be tracked, manipulated, or induced, deliberately or unintentionally. I have been particularly fascinated by the wide remit of Elizabeth A. Wilson’s book Affect and Artificial Intelligence, which explores emotion and empathy both in the machine and in the user of technology (Wilson 2010) - in ways which go beyond the stated focus on AI.


Emma: I wanted to study the emotional connection between humans, and how technology can be used as an extension of the emotion we can understand and also display. For example, humans mainly convey emotion through facial movements, gesture and tone of voice, but can technology be used as an extension of this to broaden our range of understanding across different emotional contexts? It is interesting to me that the degree of empathy a human displays towards another person or object is directly related to their understanding of the world and their life experiences up to that point, highlighting how empathy is not a constant and universal experience and differs amongst individuals. So far in the project I have researched existing computational arts projects that seek to externalise an individual's emotions, for example embarrassment or shyness in Embarrassed Robots (2017) by Soomi Park. From the perspective of inducing emotions in others, I particularly liked the audiovisual performance Contact by Felix Faire, which synthesised sound from everyday objects to create an immersive experience.


Christina: I am attracted to this study because of my ongoing questioning of how computational procedures can interact with and influence someone’s perception, and of how emotions can emerge through the observation of contemporary art, computer generated or not. Considering the evolution of art, and in particular sound art, moving across the fields of sound design to electroacoustic composition, I have always wondered how contemporary compositional procedures that involve the use of technology can provoke emotional responses. Are these emotional responses stronger or weaker in comparison with traditional forms like western orchestral music? This idea has also been influenced by the existing project of Coutinho and Cangelosi, who made an extended study measuring and examining the emotions produced while listening to classical music (Coutinho and Cangelosi, 2011).

In addition, it is worth examining how the new computational tools now at our disposal can take part in answering these questions, by helping to clarify and identify the existence of an emotional response and the emergence of empathy in the context of an observed artefact. It would therefore be beneficial to examine these thoughts from the perspective of art in general, including the visual arts, which is the goal of this project.

Research Methodology

We want to investigate why individuals find some things emotional and others less so. This may be due to the context they are displayed in, or the subject matter of the material, and we believe it relates strongly to individual closeness, or empathy, to the situation. To carry out this research, we have read current theories around feminist technoscience and aesthetic perception - our bibliography gives a sample of those texts we have found most poignant. Ideas of kinesthetic empathy and simulation theory are of specific interest: authors in these areas have suggested that it is possible to experience empathy based only on observing the movements of another human being (Reynolds and Reason 2012), and that when we see someone experiencing an emotion, we simulate it (Freedberg and Gallese 2007). In addition, we have investigated artists who are working at the intersection of ‘synthesised’ and ‘natural’ artwork. In this context, ‘synthesised’ refers to computational, generative, and algorithmic pieces, whereas ‘natural’ refers to traditional art mediums, such as painting or sculpture. To understand the creation process, we will interview artists to identify how they structure their practice. We wish to find out if these artists are actively concerned about the emotional impact they are producing through their art, or whether it is a side effect of their pieces.

After our initial research, we will conduct experiments to quantify the emotional experience a user has when interacting with different materials. This will give us data that will allow us to expand our understanding of emotional perception and empathetic response. We will then use this data to create an artefact that contributes a different viewpoint to the existing discussion around empathy, specifically in the context of natural and synthesised art. In more detail, our experiment will use GSR signals (Montagu and Coles, 1966) to measure the emotional response of participants while they are exposed to selected visual and auditory stimuli. The experiment will involve the following steps:

Select participants - begin with the responses of a single individual, possibly expanding to others to compare and contrast, depending on scope and on initial data findings.

Identify emotional material to display in the experiment - separate into categories by emotional content (e.g. sad images, hopeful images, etc.). Type of material can be from across the arts, and possibly beyond.

Construct the GSR device using an Arduino (a minimal reading sketch is outlined after this list)

Create a questionnaire to collect users’ opinions after the experiment: focus on both objective and subjective facets

Conduct physical experiments

Analyse raw data and design artefact
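
For the GSR device itself, a minimal Arduino sketch along the following lines could stream readings to a host computer for logging. The analogue pin, baud rate and 10 Hz sampling interval are illustrative assumptions rather than final design decisions, and will depend on the specific sensor we wire in.

// Minimal sketch for streaming raw GSR readings over serial
// (assumptions: sensor wired to A0, 10 Hz sampling, 9600 baud).
const int GSR_PIN = A0;                        // assumed analogue pin for the GSR electrodes
const unsigned long SAMPLE_INTERVAL_MS = 100;  // assumed 10 Hz sampling rate

void setup() {
  Serial.begin(9600);                          // stream readings to the host for logging
}

void loop() {
  int raw = analogRead(GSR_PIN);               // 0-1023 on the Arduino's 10-bit ADC
  Serial.print(millis());                      // timestamp in milliseconds
  Serial.print(",");
  Serial.println(raw);                         // raw skin-conductance reading
  delay(SAMPLE_INTERVAL_MS);
}

Each line of output is a timestamp-value pair, which keeps the logging format trivial to parse when we come to analyse the raw data.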

Project Timeline

16th November: Proposal deadline

16th November: Build emotional response tracker (GSR device) and test output

21st November: Peer assessment deadline

23rd November: Curate content of emotion inducing material (from across the arts and beyond)

26th November: Finalise questionnaire for study participants

30th November: Complete user participation studies

30th November: Complete artist interviews

7th December: Complete artefact from user participation data

16th December: Complete write up and abstract

17th December: Deadline

Artefact Creation and Documentation

Our artefact will be influenced by the outcomes of both our experimental and research data. The qualitative research theories and artist interviews will inform the ideation of the artefact, whilst the quantitative metrics obtained from the GSR output and user interviews will make up the content of the artefact. We envisage creating a 3D-printed visualisation of emotional response data if this proves feasible, or otherwise will create a purely digital data visualisation using tools such as openFrameworks.
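
As a sketch of the purely digital fallback, and assuming the Arduino output has been saved to a comma-separated file of timestamp and raw value pairs (the file name gsr_log.csv is hypothetical), a minimal openFrameworks app could map the readings onto a polyline as a starting point for the visualisation:

#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    std::vector<float> readings;               // raw GSR values loaded from the log

    void setup() override {
        ofBuffer buf = ofBufferFromFile("gsr_log.csv");       // hypothetical log file in bin/data
        for (auto line : buf.getLines()) {
            auto parts = ofSplitString(line, ",");
            if (parts.size() == 2) {
                readings.push_back(ofToFloat(parts[1]));       // keep the reading, drop the timestamp
            }
        }
    }

    void draw() override {
        ofBackground(0);
        ofSetColor(255);
        if (readings.size() < 2) return;
        ofPolyline trace;
        for (size_t i = 0; i < readings.size(); ++i) {
            float x = ofMap(i, 0, readings.size() - 1, 20, ofGetWidth() - 20);
            float y = ofMap(readings[i], 0, 1023, ofGetHeight() - 20, 20);  // 10-bit ADC range
            trace.addVertex(x, y);
        }
        trace.draw();
    }
};

int main() {
    ofSetupOpenGL(1024, 576, OF_WINDOW);
    ofRunApp(new ofApp());
}

This is deliberately the simplest possible rendering; the final artefact, whether printed or on screen, will be shaped by the research and interview findings described above.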

The budget for our project consists of the price of a galvanic skin response (GSR) sensor to measure users’ responses to emotional stimuli (£20), plus the cost of materials for potential 3D-printed prototypes (£5-£20).

Both our ongoing research and artefact design and creation will be documented on a website, using text and images. The website will archive all experiment data, artist interviews, and iterative prototypes.


Bibliography

Montagu, J. and Coles, E. (1966). Mechanism and measurement of the galvanic skin response. Psychological Bulletin, 65(5), pp.261-279.

Silvia, P. (2005). Emotional Responses to Art: From Collation and Arousal to Cognition and Emotion. Review of General Psychology, 9(4), pp.342-357.

Wilson, E. A. (2010). Affect and Artificial Intelligence. Seattle: University of Washington Press.

Papadimitriou, I. (2018). Beyond the Machine. Parsing Digital: Conversations in digital art by practitioners and curators, pp.67-77.

Reynolds, D., & Reason, M. (2012). Kinesthetic empathy in creative and cultural practices. Bristol: Intellect.

Coutinho, E. and Cangelosi, A. (2011). Musical emotions: Predicting second-by-second subjective feelings of emotion from low-level psychoacoustic features and physiological measurements. Emotion, 11(4), pp.921-937.

Freedberg, D. and Gallese, V. (2007). Motion, Emotion and Empathy in Esthetic Experience. Trends in Cognitive Sciences, 11(5), pp.197-203.

Faire, F. (2013). CONTACT: Augmented Acoustics. http://felixfaire.com/project/contact

Casey, S. (1998). Set Phasers on Stun: And Other True Tales of Design, Technology, and Human Error (2nd ed.). Santa Barbara: Aegean.
