Final Project - Classifying Abstract Emotion

Introduction
For our final assignment, we were asked to think about how we could represent something computationally, and why it was important to use computation, rather than another medium, to represent it.

I saw this sketch on Open Processing by InfinitiumPrestige2024. I was really interested in this kind of real-time classification and wanted to see how I could use it to think about abstract human concepts, like love, instead of objects like “dog” or “car”. 


I emailed Mimi about my idea, and she told me to think about how I could use it to comment on the model itself, drawing on the article we read in class, “But Do They Really Understand?”. The article is about the process through which Natural Language Processing (NLP) models mimic human understanding. Once my idea was set, I started to brainstorm methods of showcasing it.

Process
Footage
I needed to find the right video footage to represent my idea. I started to research and found the Internet Archive, which has media ranging from the 1950s to the present. I found home video footage from California via the Prelinger Archives and made sure to select clips of couples, families, and large groups of friends. I wanted to feed the model a simple logic: people together would be classified as love, and people alone as loneliness, to emphasize mimicry as opposed to real understanding.


Training the Model

Once I had my footage, I started to think about ways I could train the model. I knew I could use Teachable Machine, but I wanted to do something more specific and comprehensive. I planned to create a sketch that would train an ml5.js image classifier on frames from the video. I wrote code to play and pause the video, and to classify different bounding boxes that I drew with the mouse.
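My plan looked roughly like the sketch below. This is a simplified reconstruction, not my actual code: it assumes the older ml5 (0.x) featureExtractor API for transfer learning on MobileNet, and the filename, key bindings, and box-drawing details are placeholders.

```javascript
// Simplified sketch of the training workflow I was attempting
// (ml5 0.x featureExtractor / MobileNet transfer learning).
let video, featureExtractor, classifier;
let boxStart = null;

function setup() {
  createCanvas(640, 480);
  video = createVideo('prelinger_home_movie.mp4'); // placeholder filename
  video.hide();
  featureExtractor = ml5.featureExtractor('MobileNet', () => console.log('MobileNet ready'));
  classifier = featureExtractor.classification();
}

function draw() {
  image(video, 0, 0, width, height); // draw the current frame onto the canvas
  if (boxStart) {
    noFill();
    stroke(255, 0, 0);
    rect(boxStart.x, boxStart.y, mouseX - boxStart.x, mouseY - boxStart.y);
  }
}

function mousePressed() {
  boxStart = createVector(mouseX, mouseY); // start drawing a bounding box
}

// space toggles play/pause; 'l' labels the box as love, 'o' as loneliness;
// 't' trains the classifier on everything added so far
function keyPressed() {
  if (key === ' ') {
    if (video.elt.paused) video.play();
    else video.pause();
  } else if (key === 'l' || key === 'o') {
    addBoxAsExample(key === 'l' ? 'love' : 'loneliness');
  } else if (key === 't') {
    classifier.train(loss => console.log(loss === null ? 'done training' : loss));
  }
}

function addBoxAsExample(label) {
  if (!boxStart) return;
  // crop the drawn box out of the canvas (assumes it was drawn left-to-right)
  const crop = get(boxStart.x, boxStart.y, mouseX - boxStart.x, mouseY - boxStart.y);
  classifier.addImage(crop, label);
  boxStart = null;
}
```

Even in this simplified form, the boxes are static: once a figure moves, the crop no longer contains them, which is part of why I want to revisit this with some form of tracking.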


This didn’t work for a number of reasons. I wanted the bounding boxes to track the movements of the figures I drew them around, and I ran into several errors where the model wasn’t able to save my examples of “love” or “loneliness”. I wasn’t able to figure this code out, and I really want to return to it in a continued exploration of this idea.

I switched to Teachable Machine, exported my footage as still images, and sorted them manually into folders of either love or loneliness.

Next, I needed to write code that would play the video I made and display the confidence levels of the classes “love” and “loneliness” in time with it. I wasn’t able to do this either. I used tools like Cursor and ChatGPT to try to track down the error, but I couldn’t understand why it didn’t work. I think the rate of classification couldn’t keep up with the video’s frame rate, but I need to explore the issue further to figure it out.
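For reference, here is a minimal version of the pattern I was aiming for, assuming a Teachable Machine model exported to a shareable URL (the URL and filename below are placeholders) and the older ml5 (0.x) callback style. Instead of classifying every frame, a new classification only starts once the previous result comes back, so the label lags the video slightly rather than the calls piling up.

```javascript
// Minimal sketch: play the video and overlay the top class and its confidence.
// Model URL and video filename are placeholders.
const modelURL = 'https://teachablemachine.withgoogle.com/models/XXXXXXX/model.json';
let classifier, video;
let label = '...';
let confidence = 0;

function preload() {
  classifier = ml5.imageClassifier(modelURL); // load the Teachable Machine model
}

function setup() {
  createCanvas(640, 520);
  video = createVideo('prelinger_home_movie.mp4', () => video.loop());
  video.hide();
  classifier.classify(video, gotResult); // kick off the first classification
}

function draw() {
  background(0);
  image(video, 0, 0, 640, 480);
  fill(255);
  text(`${label}: ${nf(confidence, 1, 2)}`, 10, 505); // confidence readout under the video
}

function gotResult(error, results) {
  if (!error) {
    label = results[0].label;           // "love" or "loneliness"
    confidence = results[0].confidence;
  }
  classifier.classify(video, gotResult); // re-classify only once the last result is back
}
```

This doesn’t so much solve the sync problem as sidestep it, which is roughly the rate mismatch I suspect was breaking my version.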

Result
My final result was this sketch. Since my attempts at the video didn’t work, I ended up cycling through different images with a key press and using the training data to classify itself. This recursive method still allowed for the commentary I was trying to make while maintaining the aesthetic that I felt best portrayed my idea.
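The final sketch boils down to something like the following. This is a rough reconstruction rather than my exact code, and the model URL, image paths, and counts are placeholders: each key press advances to the next training image, and the model classifies the very images it was trained on.

```javascript
// Simplified version of the final sketch: cycle through the training images
// with a key press and classify each one with the Teachable Machine model.
let classifier;
let images = [];
let index = 0;
let label = '...';
let confidence = 0;

function preload() {
  classifier = ml5.imageClassifier('https://teachablemachine.withgoogle.com/models/XXXXXXX/model.json');
  for (let i = 0; i < 20; i++) {
    images.push(loadImage(`frames/frame_${i}.jpg`)); // placeholder paths
  }
}

function setup() {
  createCanvas(640, 520);
  classifyCurrent();
}

function draw() {
  background(0);
  image(images[index], 0, 0, 640, 480);
  fill(255);
  text(`${label}: ${nf(confidence, 1, 2)}`, 10, 505);
}

function keyPressed() {
  index = (index + 1) % images.length; // advance to the next image
  classifyCurrent();
}

function classifyCurrent() {
  classifier.classify(images[index], (error, results) => {
    if (!error) {
      label = results[0].label;
      confidence = results[0].confidence;
    }
  });
}
```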


Conclusion & Further Considerations
After showing my work in class and deliberating on it further, I definitely want to continue exploring this concept. I received feedback to use different data for training the model, like an indie video art piece about a couple talking for hours (thanks, Mimi). In the future, I want to properly train this model using ml5.js image classification, or with another kind of model like the COCO-SSD object detector. I’m also interested in training my own neural network with the ml5.js library to do this.

Thinking into the far future, I think it would be interesting to also use facial expressions and body language to further develop this model. I’m also curious to see how the model responds when trained with sets of data from different cultures.

This concept is one I’d love to continue exploring in the future, and I’m curious to see what more I can get out of it!