Tracy was invited to participate in the MTF Performance Lab at MTF Berlin in May 2016. She collaborated with an interdisciplinary team of artists whose expertise spanned brain-interface technology, IRCAM gesture sensors, the MIT Media Lab, fashion and wearable design, performance, new software for generative and responsive projection mapping, interactive LED lighting systems, and sensor implants. Over just three days the team integrated all of this technology to create a high-tech performance in collaboration with Viktoria Modesta, a bionic and multimedia performance artist. Check out her amazing prototype video if you haven't seen it.
Tracy designed the audio interaction: she provided a short remix of the artist's track, integrated a stunning remix produced by Arielle Esther, pulled together a range of data sources, categorised the data, and built systems in Max for Live to map that data to volume parameters and sample clips within Ableton Live.
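To give a feel for the kind of mapping involved, here is a small hypothetical sketch in Python (the real systems were Max for Live patches, so this is purely illustrative): an incoming sensor value, assumed here to arrive on a 0–100 range, is scaled to a 0.0–1.0 track-volume parameter, with light smoothing so noisy readings don't make the volume jump.

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    value = max(in_min, min(in_max, value))
    return out_min + (value - in_min) / (in_max - in_min) * (out_max - out_min)

class SmoothedVolume:
    """Exponential smoothing over successive sensor readings.

    The 0-100 input range and the alpha value are assumptions for this
    sketch, not details of the actual performance setup.
    """
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # smoothing factor: higher = faster response
        self.level = 0.0     # current volume, 0.0-1.0

    def update(self, sensor_value):
        target = scale(sensor_value, 0, 100, 0.0, 1.0)
        self.level += self.alpha * (target - self.level)
        return self.level
```

The smoothing matters in practice: brainwave readings fluctuate constantly, and driving a volume fader directly from raw values would sound jittery.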
Viktoria wore three sensors. The first, attached to her head by Mu Arts (Horácio Tomé-Marques and Francisco Marques-Teixeira), read her brainwave activity; two data sets were collected, reflecting whether Viktoria was in a meditative state or a state of concentration, and Mu Arts sent the data on to Tracy via OSC. Two R-IoT sensors were also placed on Viktoria. The first went on her prosthetic leg, turning the leg into an instrument: she could trigger samples every time she stamped her foot, and change the colour of the LED lights by moving her leg. This was set up by Jonathan Rutherford. The second sensor was placed on her wrist, allowing her to trigger sounds by punching her fist in the air.
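Turning a foot stamp into a sample trigger typically comes down to threshold detection on the motion data. Here is a hypothetical sketch of that idea (again illustrative only; the real mapping was built in Max for Live, and the threshold and timing values below are made up): a trigger fires when the acceleration magnitude crosses a threshold, with a short refractory window so a single stamp doesn't retrigger the clip.

```python
class StampTrigger:
    """Fire a one-shot trigger on acceleration spikes.

    Threshold and refractory length are illustrative assumptions,
    not values from the actual performance patch.
    """
    def __init__(self, threshold=2.5, refractory_steps=10):
        self.threshold = threshold              # magnitude counted as a stamp
        self.refractory_steps = refractory_steps  # readings to ignore after a hit
        self.cooldown = 0

    def process(self, accel_magnitude):
        """Return True if this reading should trigger a sample clip."""
        if self.cooldown > 0:
            self.cooldown -= 1                  # still in the refractory window
            return False
        if accel_magnitude > self.threshold:
            self.cooldown = self.refractory_steps
            return True
        return False
```

Without the refractory window, one stamp would register as several triggers, since the spike spans multiple consecutive sensor readings.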
The experience was documented in Wired magazine, and a documentary produced by the MIT Media Lab is featured below.
The performance also featured many talented artists, including Cyril Laurier & Maya Benainous from Handcoded (interactive projection mapping and costume lighting), Joanna Hir & Winde Rienstra, who worked on Viktoria's costume, Jonathan Rutherford (interactive lighting and tour control), and Katia Vega & Xin Liu from the MIT Media Lab (LED skin and nail controller).