Introduction
For our physical computing final, we decided to continue with the same concept as the midterm (insert link) project, but in an entirely new way. We met to solidify our goals and concept, to make sure we were all on the same page when moving forward creatively. This was our concept:
An audiovisual experience where individuals are encouraged to connect more deeply with their senses through movement. The abstracted kinetic visuals of the installation invite the inner self into the outer world. Our user experience is one where the dancer feels synchronization between what they see, hear, and feel. Through this mind-body-spirit harmony, users can relax and unplug from the hustle of daily life.
Brainstorming + Research
We were inspired by particle visualizations, and initially planned to use p5.js to create our visual. We found several inspiration images and used these as the foundation for our visual aesthetic.
When planning the physical component, we knew the midterm was a good starting point, but it wasn’t quite as danceable as we wanted. The midterm project was a 40”x40” wooden board with six buttons placed around it in a circle. Each of these buttons corresponded to a color change in the TouchDesigner visual we created. The buttons were fun to jump on, but presented two key issues: danceability/engagement and accessibility. We knew this project should (1) have a flat surface that actually mimics a dance floor and (2) have a low pressure threshold to widen the user base.
To accomplish both goals, we decided to use velostat, a thin, film-like, pressure-sensitive conductive material. For the tactile experience, we wanted to use marley, the material found on dance studio floors. After pricing marley, we found it was out of budget for the project, so we decided to use vinyl to mimic the marley sensation. We originally planned to build these two layers on a wooden board, similar to the midterm.
Another key element of making the experience more about dancing was capturing upper-body movement. We talked about how this intangible interaction should relate to position on the mat, and how both should be represented in our visual, and decided to test different sensors to capture movement.
Iterations
First Prototype
Our first prototype was a small square, which we used to test different visuals and to learn how to program the velostat. We tested a couple of different visuals using particle sketches we created over two weeks. We tested with the velostat as well as time-of-flight sensors to find an intuitive interaction for our users. The time-of-flight sensors manipulated the color of the particles, while position on the velostat manipulated the position of the particle system in the p5.js sketch.
When we tried to put everything together the night before our playtest in class, everything broke. Working with two serial ports in p5.js proved extremely difficult, and we lost our progress on the visual. We were also programming the velostat prototype incorrectly and realized we needed to change our setup with the material. We consulted the o-mat instructable (link) and asked Nasif and Ines for help.
Unable to get our prototype working for the playtest, we started to reconsider p5.js as the platform for our visuals. We went back to our visual inspiration, where there were several links to TouchDesigner tutorials and visuals. I created a visual from one of these tutorials that mapped to audio data, and set up a “pretend” interaction for our playtest in class, where I received valuable feedback on the visual itself and how it should react to user interaction.
Fabrication
Once we understood the logic of programming the velostat, we moved on to large-scale fabrication. We created a 3’x3’ board from two foam boards with velostat sandwiched in between and vinyl on top for the tactile feel. We laid five rows of electrodes with copper tape on each foam board, orienting the two sets perpendicular to one another so intersections could be read properly.
The board works through one set of digital pins and one set of analog pins. The five rows on the bottom foam board are connected to digital pins 2-6. On each pass through the loop (about 60 times per second), the Arduino drives each row high (+3.3V) in turn. The five columns on the top foam board are connected to analog input pins, which read the voltage at each intersection. The velostat in between changes resistance with pressure, creating conductivity between the two layers. When an intersection of a row and a column is read, the Arduino sends a pressure value to the serial monitor.
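The scan loop described above can be sketched as follows. This is a Python simulation of the logic rather than our actual Arduino code: `read_pressure` is a stand-in for `analogRead()` on a column pin while a row pin is driven high, and the pin bookkeeping is shown only as comments.

```python
# Simulation of the 5x5 velostat matrix scan described above. On the
# real board, the rows are digital pins 2-6 driven HIGH one at a time,
# and each column is read with analogRead() while its row is high.

ROWS, COLS = 5, 5

def scan_matrix(read_pressure):
    """Drive each row high in turn and read every column.

    `read_pressure(row, col)` stands in for analogRead() on the column
    pin while the row pin is HIGH; it returns a raw 10-bit ADC reading
    (0-1023). Returns a ROWS x COLS grid of readings.
    """
    grid = []
    for row in range(ROWS):
        # digitalWrite(ROW_PINS[row], HIGH) on the real board
        grid.append([read_pressure(row, col) for col in range(COLS)])
        # digitalWrite(ROW_PINS[row], LOW)
    return grid

# Fake sensor: a foot pressing near row 2, column 4 (zero-indexed 1, 3).
fake = lambda r, c: 800 if (r, c) == (1, 3) else 12
print(scan_matrix(fake)[1][3])  # -> 800
```

One scan of the whole grid per loop is what makes the "60 times per second" refresh possible: each intersection is visited exactly once per pass.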
We also laser cut a box for our time-of-flight sensors to go inside, planning to install it right at the front to measure movement. In an early user test, however, we received feedback that the box made people feel they had to be more careful so as not to hit it while dancing.
Using this feedback, we explored ways to use a webcam to capture body movement. We used Google’s MediaPipe in TouchDesigner as the method for controlling position, and tested the interaction in our next formal round of user testing.
Programming
We understood the logic of how our board worked, but still needed to refine our Arduino code so that the data would be usable downstream. In our user test, we used raw pressure data as parameters for different values.
The biggest feedback we got from users was that they didn’t feel in control of the visual. This was a tricky problem to solve, since we were reading arbitrary, unitless pressure values, and each value came in as a separate channel of data. In TouchDesigner, these channels can only be used individually. Figuring out how to combine all this data was our challenge.
We continued to program the Arduino and changed our code to send only one value at a time based on a threshold. Our code (which we adapted from Ines, who is so awesome) contained a calibrate function that calculated the baseline pressure values of the velostat. This is necessary because of how sensitive the velostat is to pressure. Once the baseline values are sent to the serial monitor, the mat is “calibrated” and can then send the current pressure values as well as the difference from the baseline.

We continued to manipulate the serial output and started to read only one value at a time, which turned out to be the key to working with the data in TouchDesigner. Our final code logic works like this: if a point is pressed beyond a difference threshold, two values between 0.1 and 1, assigned to its row and column, appear in the serial monitor. For example, for Row 1, Column 1, the values are 0.1, 0.1; for Row 3, Column 5, the values are 0.5, 1. When no press is detected, the serial monitor sends zeros.
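The calibrate-then-threshold logic can be sketched like this. It's a Python sketch of the logic rather than the Arduino C it actually runs in: the 0.1–1 value table is inferred from the examples above (Row 1 → 0.1, Row 3 → 0.5, Column 5 → 1), and the threshold is a placeholder where the real value was tuned by hand.

```python
# Sketch of the calibrate-then-threshold logic described above.
# VALUES maps each (zero-indexed) row/column to the 0.1-1 codes sent
# over serial; the table is inferred from the examples in the text.
# THRESHOLD is a placeholder; the real value was tuned by hand.

VALUES = [0.1, 0.25, 0.5, 0.75, 1.0]
THRESHOLD = 50  # minimum rise above baseline that counts as a press

def calibrate(idle_scans):
    """Average a few scans of the empty mat to get baseline pressures."""
    n = len(idle_scans)
    return [[sum(scan[r][c] for scan in idle_scans) / n
             for c in range(5)] for r in range(5)]

def read_press(grid, baseline):
    """Return (row_value, col_value) for the strongest press, or (0, 0)
    when nothing exceeds the threshold (the mat then sends zeros)."""
    best, best_diff = (0, 0), THRESHOLD
    for r in range(5):
        for c in range(5):
            diff = grid[r][c] - baseline[r][c]
            if diff > best_diff:
                best, best_diff = (VALUES[r], VALUES[c]), diff
    return best

baseline = calibrate([[[10] * 5 for _ in range(5)]])
grid = [[10] * 5 for _ in range(5)]
grid[2][4] = 600                   # press at row 3, column 5
print(read_press(grid, baseline))  # -> (0.5, 1.0)
```

Sending only the single strongest press per scan is what collapsed our 25 noisy channels into one predictable pair of numbers TouchDesigner could act on.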
Once we figured out our Arduino code, we moved into TouchDesigner, where we started thinking about programming a visual. We tried to map our grid to the TouchDesigner grid, so that if someone stepped in the top left corner, particles would appear in the top left corner of the visual. This was difficult since TouchDesigner doesn’t use units for its x, y, and z values. We came up with this version and tested it with users:
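The grid-to-screen mapping above can be sketched as a simple normalization. This assumes a centered coordinate space running from -0.5 to 0.5, with Row 1 / Column 1 at the top left of the mat; the actual range depends on the TouchDesigner network you feed it into.

```python
# Sketch of mapping the serial codes (0.1-1 per row/column) into a
# TouchDesigner-style coordinate space. Assumes a centered space from
# -0.5 to 0.5 with row 1 / column 1 at the top left of the mat.

def mat_to_td(col_value, row_value):
    """Map a (column, row) code pair in [0.1, 1] to (x, y) in [-0.5, 0.5]."""
    x = (col_value - 0.1) / 0.9     # 0 at the left edge, 1 at the right
    y = (row_value - 0.1) / 0.9     # 0 at the top edge, 1 at the bottom
    return x - 0.5, 0.5 - y         # center and flip y so up is positive

print(mat_to_td(0.1, 0.1))  # top-left step -> (-0.5, 0.5)
print(mat_to_td(1.0, 1.0))  # bottom-right step -> (0.5, -0.5)
```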
We realized this visual wasn’t compelling enough to dance to, and might still make users feel self-conscious about their movement, which was the opposite of our intention. I explored our visual and realized we could use our normalized values to exert different amounts of force on the particles in TouchDesigner: each point exerts a different amount of External Force, Wind, or Turbulence. The kick drum of the audio scales up the entire visual, creating a light pulsing effect. Finally, we used the webcam to create a secondary visual, which we learned from this tutorial and perfected with Michael’s help.
Result
This was our final interaction at the winter showcase! We secured our piece to the ground with a frame of duct tape and an additional layer of vinyl. Two speakers framed the front of the mat, and we reused our old time-of-flight box to hide our breadboard and wires. The visual was displayed on a large TV directly in front of the mat. We also used the VB-Audio virtual cable driver to create an internal audio cable that brought audio from Spotify directly into TouchDesigner.
Conclusion
In the future, I’m excited to see where this project can go! I think there are quite a few possibilities for larger-scale installations, with a bigger mat and multiple screens. I would love to try to install our piece in the Audio Lab and utilize immersive sound as well.