Replica is a performance using live image processing to explore how the awareness of time affects the perception of self-image. Patterns unfold, creating a dialogue between the performer and her representation and bringing the contrast between control and authenticity into question.
The project was developed in Processing using the Most Pixels Ever library. Video was pulled from a Canon 5D Mark II camera via Canon’s EOS Utility, with CamTwist routing the feed into Processing. To broadcast to each of the three machines powering the video wall, Daniel Shiffman helped us figure out a way to send and receive the images over UDP; a tutorial on how it was done is up on his site. We used a MIDI controller to switch between the sketch’s modes and adjust its parameters. The code for the project will be available on GitHub soon.
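The core idea behind the UDP broadcast can be sketched in plain Java: compress each frame so it fits in a single datagram, send it to every display machine, and decode it on the other end. This is only an illustrative sketch, not the project's actual code (which followed Shiffman's tutorial); the port number, frame size, and class name here are all made up for the example.

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import javax.imageio.ImageIO;

public class UdpFrameDemo {
    public static void main(String[] args) throws Exception {
        // Receiver socket bound to a local port; in the real setup each of
        // the three display machines would run its own receiver.
        DatagramSocket receiver = new DatagramSocket(9100);

        // A small solid-color test image standing in for a camera frame.
        BufferedImage frame = new BufferedImage(160, 120, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = frame.createGraphics();
        g.setColor(Color.ORANGE);
        g.fillRect(0, 0, 160, 120);
        g.dispose();

        // JPEG-compress the frame so it fits inside one UDP datagram
        // (a UDP payload tops out at 65,507 bytes).
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(frame, "jpg", baos);
        byte[] payload = baos.toByteArray();

        // Send the compressed frame; a broadcaster would loop over the
        // addresses of all three machines.
        DatagramSocket sender = new DatagramSocket();
        sender.send(new DatagramPacket(payload, payload.length,
                InetAddress.getByName("127.0.0.1"), 9100));
        sender.close();

        // Receive and decode; a real sketch would do this every draw() loop.
        byte[] buf = new byte[65507];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        receiver.receive(packet);
        BufferedImage decoded = ImageIO.read(
                new ByteArrayInputStream(packet.getData(), 0, packet.getLength()));
        receiver.close();

        System.out.println(decoded.getWidth() + "x" + decoded.getHeight());
    }
}
```

The trade-off here is the usual one for live video: UDP drops the occasional frame rather than stalling the whole wall waiting for a retransmit, which is the right failure mode for a performance.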
The project was performed by Claire Westby to the song “Lost in the World” by Kanye West.
A huge thanks to our instructor Daniel Shiffman, and to Rune Madsen, Nien Lam, and Lucas Zavala for all their help along the way. Another big thanks to our Big Screens classmates and the camera crew for documenting the night for us.
You can see a video of the performance here:
More photos from the night: