Introduction
For our project, Sofia and I set out to explore the concept of generating visuals through body movement, essentially creating a VJ experience powered by physical interaction. This idea stemmed from my personal experience as a dancer: whenever I dance, I feel a deep sense of harmony between mind, body, and spirit. For me, music, movement, and visual elements fuse together into a single immersive experience. With this project, I wanted to share that harmony with others, especially those who might not naturally respond to music through dance. I also drew inspiration from dance games like Just Dance and Dance Dance Revolution.
To make this experience accessible, I designed an interactive dance floor where participants could engage with visuals without needing any formal dance background. My goal was to create a setup where anyone could generate dynamic visuals through simple body movements.
I initially considered various visual output methods, such as using LEDs or software environments like P5.js and TouchDesigner to capture and interpret motion. Additionally, I explored the potential of using an Arduino to read motion data and translate it into visuals, experimenting with ways to use the technology to capture even subtle movements.
Brainstorming
In the early stages of this project, we initially sketched a small-scale setup involving force-sensing resistors. Our idea was to use four sensors that would detect finger presses on pieces of cardboard, triggering reactions on a large LED screen. However, as we discussed and refined our concept, we decided to shift towards a larger-scale setup that could be activated by foot movements. This pivot would enable participants to engage with the project through dancing, bringing the physicality of movement more into the experience.
We also moved away from using LEDs as the primary visual medium, as we realized that they limited our ability to create dynamic visual effects. Instead, we chose to use TouchDesigner, allowing us to create a more versatile, audio-reactive visual display that could respond to user inputs in real time.
With this new direction in mind, our next sketch involved designing a dance floor with interactive panels. After considering different materials, we decided to use wood to construct these panels, giving the installation a sturdy, tactile feel. As for the visual design on the TouchDesigner screen, we wanted to keep things simple, especially since this was our first time working with TouchDesigner, and we planned to continue iterating on this project.
We settled on a minimalist approach, focusing the interaction on color changes. The TouchDesigner visuals featured a basic noise pattern, with Z-translation responding to kick and snare audio cues. Meanwhile, the colors of the visuals would shift based on which of the six wooden panels participants stepped on, creating a responsive and immersive experience tied to both audio and physical interaction.
Iterations
Our project development process was divided into two main parts: programming in TouchDesigner and fabrication. I took the lead on TouchDesigner programming, while Sofia worked on fabrication.
TouchDesigner Programming
I had never used TouchDesigner before starting this project, so I began by thinking through the different ways we could actually use the Arduino and TouchDesigner together. I decided to use moving noise, since it's simple to build but still visually appealing. I followed two different tutorials, each targeting a key part of the system.
The first tutorial focused on establishing communication between the Arduino and TouchDesigner.
Using this method, I captured serial data from the Arduino and brought it into TouchDesigner. I processed this data by splitting it into six distinct channel operators, with each channel containing three values corresponding to the R, G, and B color components. Since TouchDesigner uses a color range from 0 to 1, I converted the Arduino’s serial data to align with this format.
Receiving serial data from the Arduino, which is connected to six pushbuttons
Creating the component rgbConvert, which converts serial data into six color channel operators, each with an r, g, and b value
Inside the rgbConvert component
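Our actual rgbConvert network was built from CHOPs, but the same conversion can be sketched in a Script CHOP for anyone following along. This is a minimal, illustrative version, assuming a Serial DAT named serial1 receives one comma-separated line of button states per message; the panel colors and operator names here are assumptions, not our exact file.

# Script CHOP callback: an illustrative stand-in for our rgbConvert
# network. Assumes a Serial DAT named 'serial1' whose newest row holds
# a comma-separated line like "1,0,0,0,0,0" (one value per pushbutton).

# Per-panel colors in 0-255, divided by 255 to match TouchDesigner's 0-1 range.
PANEL_COLORS = [
    (255, 0, 0),    # red
    (255, 165, 0),  # orange
    (255, 255, 0),  # yellow
    (0, 128, 0),    # green
    (0, 0, 255),    # blue
    (128, 0, 128),  # purple
]

def onCook(scriptOp):
    scriptOp.clear()
    scriptOp.numSamples = 1
    serial = op('serial1')
    if serial is None or serial.numRows == 0:
        return
    # The Serial DAT appends one row per incoming message; read the newest.
    line = serial[serial.numRows - 1, 0].val
    states = [int(v) for v in line.split(',') if v.strip().isdigit()]
    for i, state in enumerate(states[:6]):
        for j, comp in enumerate('rgb'):
            chan = scriptOp.appendChan('button%d_%s' % (i + 1, comp))
            # Pressed panels output their color; idle panels output 0.
            chan.vals = [PANEL_COLORS[i][j] / 255.0 if state else 0.0]
    return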
I wasn’t sure how to actually use these values, so I met with Jess and brainstormed different ways to integrate the Arduino data into the visuals.
I integrated the data into a Python script that organized it into a table:
def onCook(scriptOp):
    data_table = op('rgbTable')  # Replace with your Table DAT's actual name
    data_table.clear()

    # Append headers to the table
    header = ['pos', 'r', 'g', 'b', 'a']
    data_table.appendRow(header)

    # List of CHOPs, update this list with your actual CHOP names
    chops = ['redRGB', 'orangeRGB', 'yellowRGB', 'greenRGB', 'blueRGB', 'purpleRGB']
    valid_chops = []

    # Amount of black space as a percentage of the ramp (adjust as needed)
    black_space_fraction = 0.1  # 10% black space at each end

    # Loop through each CHOP and gather active ones
    for chop_name in chops:
        current_chop = op(chop_name)
        if current_chop is None:
            continue
        try:
            r_value = current_chop['r'][0]
            g_value = current_chop['g'][0]
            b_value = current_chop['b'][0]
            # Check if the CHOP is active by checking if any channel value is non-zero
            if r_value != 0 or g_value != 0 or b_value != 0:
                valid_chops.append([r_value, g_value, b_value])
        except (KeyError, TypeError):
            # Skip any CHOP that is missing an r, g, or b channel
            continue

    total_chops = len(valid_chops)
    if total_chops == 0:
        # If no data is coming in, set a default black and white gradient
        data_table.appendRow([0.0, 0, 0, 0, 1])  # Black at position 0
        data_table.appendRow([1.0, 1, 1, 1, 1])  # White at position 1
        return

    # Calculate dynamic pos values for the active CHOPs, reserving black space at both ends
    pos_range_start = black_space_fraction  # Start of the colored section
    pos_range_end = 1 - black_space_fraction  # End of the colored section
    pos_step = (pos_range_end - pos_range_start) / (total_chops - 1) if total_chops > 1 else 0

    # Append the initial black space row (optional)
    data_table.appendRow([0.0, 0, 0, 0, 1])  # Black at the start

    for i, rgb_values in enumerate(valid_chops):
        pos_value = pos_range_start + i * pos_step
        data_table.appendRow([pos_value] + rgb_values + [1])

    # Append the final black space row (optional)
    data_table.appendRow([1.0, 0, 0, 0, 1])  # Black at the end
    return
This table then fed into a Ramp TOP, a texture operator in TouchDesigner that generates color gradients. This pipeline allowed me to map the color inputs from the Arduino into a cohesive visual output in TouchDesigner.
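The pos, r, g, b, a header in the script matches the column layout a Ramp TOP expects when its gradient comes from a table, so the hookup is a single line. This sketch assumes the operators keep the names ramp1 and rgbTable, and that the Ramp TOP's DAT parameter keeps its default scripting name:

# Point the Ramp TOP at the color table built by the script above
# ('ramp1' is an assumed operator name).
op('ramp1').par.dat = 'rgbTable'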
The second tutorial I followed covered audio analysis in TouchDesigner. In my file, I loaded an audio track and used TouchDesigner to analyze the song's kick and snare drum sounds. This drum data was sent to a Noise TOP, which moved along the Z-axis in response to the drum beats, adding a dynamic layer of movement.
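As a rough sketch of that hookup, assuming the analysis network is named audioAnalysis and outputs kick and snare channels (names vary by tutorial), the Noise TOP's Z translate can be driven by an expression:

# Drive the Noise TOP's Z translate from the drum channels.
# 'noise1', 'audioAnalysis', and the channel names are assumptions.
noise = op('noise1')
noise.par.tz.mode = ParMode.EXPRESSION
noise.par.tz.expr = "op('audioAnalysis')['kick'] + op('audioAnalysis')['snare']"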
Finally, I combined the Arduino-controlled color visuals with the audio-driven noise to create a unified visual output. The result was a dynamic, color-shifting noise pattern that responded both to foot movements (via Arduino input) and to audio cues (from the drum analysis). To make the movement more intuitive, I also added a bounce, which triggers a rotation on the X and Y axes each time a button is pressed.
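One way to wire that bounce, sketched here rather than copied from our file, is a CHOP Execute DAT watching the six button channels and pulsing a Trigger CHOP (assumed to be named bounce1) whose envelope drives the rotation:

# CHOP Execute DAT callback: fires whenever a watched button channel changes.
def onValueChange(channel, sampleIndex, val, prev):
    # React only to a press (0 -> 1), not a release.
    if val == 1 and prev == 0:
        # Pulse the Trigger CHOP to kick off the bounce envelope
        # ('bounce1' is an assumed operator name).
        op('bounce1').par.triggerpulse.pulse()
    return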
Fabrication
The fabrication process began with a small prototype to test our button mechanism. Since each sensor only needed to send a binary value (0 or 1), we experimented with various switch designs before settling on a plan. Our final layout arranged six switches in a circle on a 45-inch by 45-inch wooden floor, each switch representing a different color.
Prototype Development
First, we brainstormed ways to create a switch that could (1) reliably send consistent values to the Arduino and (2) be inviting for users to step on. Nikolai showed us a ring setup that we decided to test out:
We constructed a small-scale prototype using a laser cutter to shape two wooden circles and a ring of cardboard. The cardboard ring was placed between the two wooden circles, and inside this ring, we attached a strip of aluminum foil connected to power and ground. Another piece of aluminum foil was placed on the top wooden circle. This setup created a simple switch: pressing down on the top circle brought the aluminum contacts together, completing the circuit and sending a signal.
We tested this prototype by connecting it to a breadboard with an LED and checking its functionality. The test was successful, but the design wasn't feasible long-term, since the cardboard ring would eventually give out under repeated pressure.
Board Construction
For the actual board, we cut six larger circles, each with a 20-centimeter diameter, to be arranged around the wooden floor. We also upgraded the materials, using foam rings instead of cardboard, as foam provided greater durability and stability. This setup became the foundation for our interactive dance floor, with each switch designed to correspond to a specific color channel in TouchDesigner, allowing for a reliable, engaging interaction.
Final Product
Our final product was a 45-inch by 45-inch dance floor with six circular panels, each 20 centimeters in diameter. These panels were arranged in a circle on the board, creating a layout that encouraged participants to move between them. In the center of the circle, we painted a "Start Here" sign, guiding users on where to begin interacting with the installation.
Each panel was painted to match the colors displayed in the TouchDesigner visual, with panels corresponding to red, orange, yellow, green, blue, and purple. This color-coding made it intuitive for users to connect their physical movements on the dance floor with the dynamic visuals on the screen.
Conclusion
Overall, our project received positive feedback, particularly for its visual and auditory engagement, which successfully invited participants to explore movement on the dance floor. The design was intuitive, effectively conveying our goal of encouraging people to dance and interact with the visuals. Many users were drawn to the installation and eager to test it out.
However, some feedback suggested that we could make the visuals more responsive to physical input. Currently, each step triggers only slight changes: a rotation on the X and Y axes and color adjustments in the Ramp TOP. Moving forward, we'd like to explore ways to make the visuals more dynamic, allowing them to react more distinctly to different types of movement and dance styles.
In terms of fabrication, we aim to refine the setup to create a flat, seamless surface rather than panels that stick out from the ground. This change would allow participants to engage in a broader range of dance movements beyond just stepping or jumping on individual panels, making the experience more fluid and accessible.
Looking ahead, we’re excited to expand the interactivity with additional sensors and more complex visuals. By further developing the project, we hope to create an even more immersive experience that continues to invite people to move, interact, and feel connected through music and visuals.