Emotion Recognition
This project was an academic assessment focusing on how human interaction has been mediated by technology during the COVID-19 pandemic. The inaugural Goldsmiths project tasked students with experimenting with their interests over twelve weeks. Over the term, I examined different human-computer interaction (HCI) relationships. The first was a computer mediating human interaction, sharing recorded dialogue with a random selection of individuals. The second was a human mediating computer interaction, creating an infinite AI loop between Apple's Siri and Amazon's Alexa. The third examined the relationship between linguistics and technology in service announcements, interpreting where the spoken word exists in a given space. These largely unsuccessful pathways forced me to reexamine the project as a whole through a set of interviews questioning our relationship with technology.
The interviews highlighted a widespread feeling that technology is inauthentic. One interviewee pointed out that conversation over video chat platforms is lonely, with the device erecting social barriers. These barriers include a lack of emotion-sensing: interaction over video chat cannot mirror in-person interaction, where people read body language and build a genuine social connection.
The online classroom was a setting where this issue was especially prevalent. Students can hide, figuratively and literally, by turning off cameras and microphones, leaving other participants unable to build genuine social connections.
How can you make technology more authentic? Would the addition of emotion and emotion-sensing create a more authentic interaction? Methods already tested include skin-conductance and eye-movement sensing to improve connection and concentration, while other modalities such as taste and smell remain theoretical within the academic community.
Claim: our relationship with technology is always emotional, e.g. social media.
How do you show it?
Your face
I began by examining the 19th-century case study of Cesare Lombroso's criminal anthropology, where overlapped film negatives of convicted criminals created ghost-like composite faces that categorised suspects by resemblance. I was curious whether the same technique could apply to facial expressions. According to Hupont et al. (2010), there are six categorical emotions: happiness, sadness, fear, anger, disgust and surprise. Using this theory, I photographed 12 subjects displaying the set emotions and overlapped the images to create a single ghost-like face, similar to Lombroso's study.
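The compositing step itself can be sketched in a few lines of Python, assuming the twelve photographs are pre-aligned and share the same dimensions; the file names below are placeholders, not the project's actual assets.

```python
# Minimal sketch: average twelve aligned face photographs into one
# Lombroso-style composite. Converts to greyscale for simplicity;
# file names are hypothetical placeholders.
import numpy as np
from PIL import Image

paths = [f"subject_{i:02d}_happiness.jpg" for i in range(1, 13)]
stack = np.stack([np.asarray(Image.open(p).convert("L"), dtype=np.float64)
                  for p in paths])
composite = stack.mean(axis=0)  # per-pixel average across all subjects
Image.fromarray(composite.astype(np.uint8)).save("composite_happiness.jpg")
```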
Facial emotions were classified using the distances and angles between specified points on the face; these measurements produced rough ratios associated with each classified emotion.
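In pseudocode, the ratio idea looks roughly like the sketch below. Normalising each distance by the inter-ocular span keeps the ratios comparable across subjects at different distances from the camera. The landmark names and threshold values here are purely illustrative, not the project's calibrated figures.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def angle(a, b, c):
    """Angle at vertex b (degrees) formed by points a-b-c."""
    ab = (a[0] - b[0], a[1] - b[1])
    cb = (c[0] - b[0], c[1] - b[1])
    dot = ab[0] * cb[0] + ab[1] * cb[1]
    return math.degrees(math.acos(dot / (math.hypot(*ab) * math.hypot(*cb))))

def classify(lm):
    """Illustrative rule set: distances and angles normalised by the
    inter-ocular span. Keys and thresholds are hypothetical, not the
    project's calibrated values."""
    eye_span = dist(lm["left_eye"], lm["right_eye"])
    mouth_w = dist(lm["mouth_left"], lm["mouth_right"]) / eye_span
    mouth_open = dist(lm["upper_lip"], lm["lower_lip"]) / eye_span
    # Angle at the lower lip between the two corners: a smile pulls the
    # corners up relative to the lip centre, tightening this angle.
    smile_curve = angle(lm["mouth_left"], lm["lower_lip"], lm["mouth_right"])
    if mouth_w > 0.9 and smile_curve < 150:
        return "happiness"   # wide mouth, corners raised
    if mouth_open > 0.45:
        return "surprise"    # dropped jaw
    return "neutral"
```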
Next, I wanted to see whether I could apply this data to a mundane scenario. I recorded a one-on-one tutorial with my studio tutor, attaching a facial mesh to my face that announced a detected emotion whenever the measured lengths and angles matched a given emotion's ratios.
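The write-up doesn't state the tooling behind the mesh, so the following is one plausible wiring rather than the project's actual pipeline: OpenCV for the camera loop and MediaPipe FaceMesh for the landmarks, reusing the illustrative classify() rule sketched above.

```python
# Assumed wiring, not the project's confirmed stack: OpenCV webcam loop
# plus MediaPipe FaceMesh, with classify() from the earlier sketch.
import cv2
import mediapipe as mp

mp_mesh = mp.solutions.face_mesh
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)
with mp_mesh.FaceMesh(max_num_faces=1) as mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            face = results.multi_face_landmarks[0]
            h, w = frame.shape[:2]
            pt = lambda i: (face.landmark[i].x * w, face.landmark[i].y * h)
            # Standard FaceMesh indices for eye corners, mouth corners, lips.
            lm = {"left_eye": pt(33), "right_eye": pt(263),
                  "mouth_left": pt(61), "mouth_right": pt(291),
                  "upper_lip": pt(13), "lower_lip": pt(14)}
            mp_draw.draw_landmarks(frame, face, mp_mesh.FACEMESH_TESSELATION)
            cv2.putText(frame, classify(lm), (20, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("emotion overlay", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```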
The final video uses my emotion-recognition data to create a more authentic interaction over video conferencing.
A recognisable flaw in the project is its disregard for ethnic diversity in the subject database: different cultures may express specific emotions with different facial lengths and angles.
The project's next steps are a live test with emotions recognised in real time, and a social media emotion tracker that records facial expressions in parallel with social media usage. These applications would surface notifications on a user's device to mediate screen time, or curate content based on displayed emotion.
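As a purely speculative sketch of the screen-time idea, a mediator could watch a rolling window of detected emotions and nudge the user when negative affect dominates; every name and threshold below is hypothetical.

```python
# Speculative sketch of screen-time mediation: nudge the user when a
# rolling window of detected emotions is mostly negative while an app
# is open. All names and thresholds are hypothetical.
from collections import deque

NEGATIVE = {"sadness", "anger", "disgust", "fear"}

class EmotionNudger:
    def __init__(self, window=30, threshold=0.6):
        self.samples = deque(maxlen=window)  # most recent emotion labels
        self.threshold = threshold           # negative fraction that triggers

    def observe(self, emotion, app):
        self.samples.append(emotion)
        if len(self.samples) < self.samples.maxlen:
            return None  # wait until the window fills
        share = sum(e in NEGATIVE for e in self.samples) / len(self.samples)
        if share >= self.threshold:
            return f"You've looked unhappy while using {app}. Take a break?"
        return None
```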
This project tested my design research and interaction design abilities. After practising in Adobe After Effects and conducting user research, I constructed the tracking facial mesh and the final video. The work challenged me to experiment with and learn new design software, which greatly increased the visual impact of the project.
Interaction Designer: Benjamin Jeffries
Studio Tutor: Sarah Pennington