Portfolio
Hello! Below are some examples of my recent work in creative technology, immersive theatre, and A/V design.
Seagull Simulator (2025)
Synopsis: An immersive VR theatre piece developed for the National Theatre of Scotland's XR Residency.
My Role & Contribution: My task was to build a cohesive system that could seamlessly blend the audience's virtual experience in the Quest VR headset with physical sensations in the real world. I achieved this with a control system and cueing library built in TouchDesigner. Acting as the show's main interface, it used OSC to synchronize the Unity application with a multi-channel spatial audio soundscape, DMX lighting, and physical hardware including a wind machine. A particularly fun part of the project was technical prop design: I created a custom Bluetooth-controlled seashell that served as an intimate sound source for the audience member, with its audio transmitted wirelessly and controlled live from the main system, so audience members could handle it freely. The result was a complex but easy-to-operate system designed to be run by a single operator during the live show, developed in a fast-paced, highly collaborative environment with the artistic team.
Technologies Used: TouchDesigner, Unity, DMX, Ableton Live
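As a rough sketch of the OSC cueing idea described above: one cue message, encoded per the OSC 1.0 spec, can be broadcast to every subsystem at once (the `/show/cue` address pattern and the endpoint list are illustrative assumptions here, not the show's actual address space):

```python
import socket
import struct

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC 1.0 message (int32, float32, and string args only)."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, bool):
            raise TypeError("bool not supported in this sketch")
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, str):
            tags += "s"
            payload += pad(a.encode())
        else:
            raise TypeError(f"unsupported arg type: {type(a)}")
    return pad(address.encode()) + pad(tags.encode()) + payload

def fire_cue(cue: int, endpoints) -> None:
    """Send the same cue number to every listening subsystem over UDP."""
    msg = osc_message("/show/cue", cue)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for host, port in endpoints:
        sock.sendto(msg, (host, port))
    sock.close()

# Hypothetical endpoints: the Unity app, the audio rig, the lighting bridge.
fire_cue(3, [("127.0.0.1", 9000), ("127.0.0.1", 9001), ("127.0.0.1", 9002)])
```

In practice a tool like TouchDesigner handles the OSC encoding itself; the point of the sketch is just the fan-out pattern, where one cue drives several independent systems in lockstep.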
The Dreaming (2025)
Synopsis: A contemporary live performance piece by artist Althea Young, presented at Buzzcut's "Double Thrills".
My Role & Contribution: I was involved from the early stages of this project as a key collaborator, contributing to devising and dramaturgy, but my primary role was designing the show's expansive sound world. In the live performance my role is both technical and creative: using a custom system built in Ableton Live and TouchDesigner, I build and manipulate the soundscape in real time, reacting to the performance as it unfolds. The system runs in sync with the production's lighting, creating a cohesive, immersive audio-visual experience operated from a single control point.
Technologies Used: Ableton Live, TouchDesigner, Live Sound Operation
BRAINROT (Work-in-Progress)
Synopsis: A live, improvised audio-visual performance based on a custom instrument I developed. The system, built in Ableton and TouchDesigner, dynamically sources and processes internet videos to explore themes of contemporary meme culture and sensory overstimulation.
My Role & Contribution: My primary role was co-designing and building this entire audio-visual instrument from the ground up. The core concept is a synesthetic parity between sound and image: the visuals are not a backing track, but are generated and processed in real-time by the exact same data that creates the soundscape. I use MIDI controllers to manipulate a suite of custom post-processing effects (delays, granulation, feedback loops) that affect both audio and visuals simultaneously, building an immersive and often overwhelming theatrical experience. This work is currently in active development. I have provided a link to an excerpt from a recent performance of the project.
Warning: The linked video contains rapidly flashing images.
Technologies Used: TouchDesigner, Ableton Live, MIDI, Real-time Media Processing
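The "same data drives both sound and image" idea above can be sketched as a small routing layer: one incoming MIDI control value is normalized once, then pushed to every bound target, audio and visual alike. All names and parameter ranges here are hypothetical stand-ins; the real instrument lives in Ableton Live and TouchDesigner.

```python
class ParameterBus:
    """Routes a single normalized control value to multiple targets,
    so sound and image are always driven by the same data."""

    def __init__(self):
        self._targets = []

    def bind(self, fn) -> None:
        """Register a callable that receives the normalized value (0.0-1.0)."""
        self._targets.append(fn)

    def on_cc(self, cc_value: int) -> None:
        """Handle a MIDI CC value (0-127): normalize it, then fan it out."""
        v = max(0, min(127, cc_value)) / 127.0
        for fn in self._targets:
            fn(v)

# Hypothetical targets standing in for Ableton / TouchDesigner parameters.
state = {}
bus = ParameterBus()
bus.bind(lambda v: state.update(grain_size=0.01 + v * 0.49))  # audio granulation
bus.bind(lambda v: state.update(feedback_mix=v))              # video feedback loop

bus.on_cc(64)  # one knob turn updates both the sound and the image
```

The design choice this illustrates is that the controller never addresses audio or video directly; it only addresses the shared bus, which is what keeps the two media in strict parity.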
Thank you! Feel free to browse my main site or get in touch.