
SWARM

An Interactive, Augmented Reality Experience

Using JavaScript along with the p5.js and ml5.js libraries, I developed an immersive, interactive human-digital system where a swarm of bees targets and follows a participant. In the default state (with no participant), the bees move slowly, buzzing left and right. In the active state, once a participant is detected, the bees swarm and track the participant's movements until they are no longer within the camera’s view. Click the button below to see what the buzz is all about! Be sure your sound is on, and please use Google Chrome on a laptop or desktop for the best experience.

CLASSES

I began this project by creating classes. A class is a template a programmer writes that defines how objects are created. Within this code, I defined each object's properties (What color is it? What size? Where is it? Etc.) as well as its methods (What does it do? How does it move? Etc.)
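A minimal sketch of what such a class can look like, assuming hypothetical names (`Bee`, `update`, `show`); the p5.js drawing call is stubbed as a comment so the logic stands on its own:

```javascript
class Bee {
  constructor(x, y) {
    // Properties: where the bee is, how big it is, how fast it drifts.
    this.x = x;
    this.y = y;
    this.size = 12;
    this.speed = 1;
  }

  // Method: drift horizontally in the default (idle) state.
  update() {
    this.x += this.speed;
  }

  // Method: draw the bee; in p5.js this would be a call like
  // ellipse(this.x, this.y, this.size).
  show() {
    // ellipse(this.x, this.y, this.size);
  }
}
```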

ARRAYS

Then I used an array to create many ellipses at once. An array is a data structure that stores multiple elements in an ordered list.
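The idea can be sketched like this, with a stripped-down stand-in class (the names and numbers are illustrative, not the original code):

```javascript
// Stand-in for the full class from the previous step.
class Bee {
  constructor(x, y) { this.x = x; this.y = y; }
  update() { this.x += 1; }  // idle drift
}

// An array holds all the bees, so one loop can move the whole swarm.
const bees = [];
for (let i = 0; i < 50; i++) {
  // Scatter 50 bees at random positions on a 640x480 canvas.
  bees.push(new Bee(Math.random() * 640, Math.random() * 480));
}

// Each frame: update (and, in p5.js, draw) every element of the array.
for (const bee of bees) {
  bee.update();
  // ellipse(bee.x, bee.y, 12);  // p5.js draw call
}
```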

TARGETING

After that, I created two classes of ellipses: one that drifts toward the right side of the screen, and another that drifts toward the left. I also programmed the ellipses to swarm the cursor when the mouse is pressed. Go ahead and give it a shot!


Place your cursor in the black box, press down on the mouse, and drag your cursor around the screen.
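Roughly, the targeting behavior works like this. The class name, the `attraction` factor, and the easing approach below are illustrative stand-ins, not the original code:

```javascript
class Chaser {
  constructor(x, y, dir) {
    this.x = x;
    this.y = y;
    this.dir = dir;          // +1 drifts right, -1 drifts left
    this.attraction = 0.05;  // fraction of the gap to the cursor closed per frame
  }

  update(mousePressed, mouseX, mouseY) {
    if (mousePressed) {
      // Swarm state: ease toward the cursor each frame.
      this.x += (mouseX - this.x) * this.attraction;
      this.y += (mouseY - this.y) * this.attraction;
    } else {
      // Idle state: drift left or right depending on the class's direction.
      this.x += this.dir;
    }
  }
}
```

Because each chaser only ever closes a fraction of the remaining distance, the swarm eases toward the cursor instead of snapping to it, which reads as organic motion.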

POSENET()

Lastly, I imported PoseNet (via the ml5.js library), a machine learning model that estimates real-time human pose data from video input. I replaced the ellipses with a bee I designed in Adobe Illustrator and adjusted the bee's movement algorithms to target specific points on the participant's body. To enhance the interaction, I added a buzzing sound effect that activates whenever a participant is detected.
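The pose-driven targeting can be sketched in plain JavaScript. In the browser, `ml5.poseNet(video, callback)` supplies pose objects; the keypoint shape below mirrors older ml5.js PoseNet output and may differ by version, and `findTarget`/`steerToward` are hypothetical helper names, not the original code:

```javascript
// Pick a specific body point (here, the nose) as the swarm's target,
// returning null when there is no confident detection.
function findTarget(pose, minScore = 0.5) {
  if (!pose) return null;
  const nose = pose.keypoints.find(k => k.part === 'nose');
  if (!nose || nose.score < minScore) return null;
  return { x: nose.position.x, y: nose.position.y };
}

// Ease a bee toward the target; with no participant in view,
// fall back to the idle drift.
function steerToward(bee, target, ease = 0.08) {
  if (!target) { bee.x += 1; return; }
  bee.x += (target.x - bee.x) * ease;
  bee.y += (target.y - bee.y) * ease;
}
```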


I envision this project being installed in public spaces such as subway stations, malls, or parks, inviting the general public to engage and interact. I’m drawn to the idea of creating installations that transform input from the physical world into temporary works of art. This innovative approach not only enhances the beauty and intrigue of shared spaces but also fosters a deeper connection between people and their surroundings, cultivating a sense of presence and building a shared community through each unique experience.

UCSB HOUSE SHOW 2025

My team and I submitted this installation to UCSB’s House Show, an artist showcase featuring work from the Media Arts and Technology, Music, and Art departments. The algorithm behind the piece tracks people as they move through the space—whether they’re passively passing by or actively interacting. Their motion becomes a paintbrush, drawing lines across the canvas. The program cycles through colors and randomly varies the thickness of each line.
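A rough sketch of the paintbrush logic, under stated assumptions: tracked positions arrive as (x, y) points, hue cycles with time, and stroke thickness is randomized per line. The names (`makeStroke`, `trail`) and ranges are illustrative:

```javascript
// Turn two consecutive tracked positions into one brush stroke.
function makeStroke(from, to, frameCount) {
  return {
    from,
    to,
    hue: frameCount % 360,          // cycle through colors over time
    weight: 1 + Math.random() * 9,  // random thickness, 1-10 px
  };
}

// Each tracked person contributes a growing trail of strokes.
const trail = [];
let prev = null;
for (const point of [{ x: 10, y: 20 }, { x: 30, y: 40 }, { x: 50, y: 60 }]) {
  if (prev) trail.push(makeStroke(prev, point, trail.length));
  prev = point;
}
// In p5.js, each stroke would be drawn with something like:
// stroke(s.hue, 100, 100); strokeWeight(s.weight);
// line(s.from.x, s.from.y, s.to.x, s.to.y);
```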


Every minute, the generated image is saved, and an audio playback sequence begins. The piece maps the y-coordinate of each point on a line to a frequency: higher positions produce higher pitches, and lower positions produce lower ones. The duration of each sound corresponds to the length of the brush stroke. Once the audio finishes, the system resets with a blank canvas.
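The audio mapping can be sketched as two small functions. The canvas height, frequency range, and milliseconds-per-pixel scale below are assumptions for illustration, not the values used in the installation:

```javascript
// Map a point's y-coordinate to a pitch. Canvas y grows downward,
// so invert it: points higher on screen (smaller y) give higher frequencies.
function yToFrequency(y, height = 480, lowHz = 200, highHz = 2000) {
  const t = 1 - y / height;
  return lowHz + t * (highHz - lowHz);
}

// Longer brush strokes play for longer.
function strokeDuration(from, to, msPerPixel = 5) {
  const length = Math.hypot(to.x - from.x, to.y - from.y);
  return length * msPerPixel;
}
```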

In this loop, a new, unique audiovisual artwork is created every minute—crafted by the public, whether they intend to participate or not.

View code on GitHub

