Multimedia Theater Piece
Ourselves Talk (2012)
A commissioned work for the Biennial Symposium on Arts and Technology at Connecticut College

[Image: ourselves_pic]

In Ourselves Talk, “Him” and “Her,” a couple at the breaking point, become lost in their respective fantasies, Him in carnal sexuality and Her in a romantic encounter too sweet to be realized. Through Phillip Gulley’s text and live performance of the character “Him” collaged with Akio Mokuno’s vivid soundscape, the abstract noise and music create an alive and visceral environment in which the characters ride out the demise of their relationship. The couple loses the ability to communicate directly as headphones broadcast text that is tightly bound with Akio’s sound performance. Collaboratively generated video is mixed on a screen, the only space where direct interaction occurs, turning living beings into two-dimensional avatars and reflecting their digitally infused existences. The couple slides in and out of each other’s reality as beginnings and endings, desires and experiences, blur.
– Phillip Gulley


Ourselves Talk is a collaborative theater piece I created with Phillip Gulley, with whom I studied Performance & Interactive Media Arts at Brooklyn College. The piece was commissioned for the Biennial Symposium on Arts and Technology at Connecticut College in March 2012, where we were artists in residence, and it was performed in the concert following the symposium. We were asked to make a twelve-minute performance. The piece is an intensive text-based reading in which Phillip relates his heartbreaking personal story, combined with various visual and sonic representations that modify “His” mental flow and memory.

Initially, we wanted an actress to play the part of his ex-girlfriend; however, the actress who had agreed to join us cancelled one month before the performance. At that point, Phillip began making images of the female character instead: a girl staring at a flickering television, and a girl in underwear crawling on a bed and cuddling. The face staring at the flickering monitor represents the desperation the couple is experiencing; the crawling and cuddling images represent Phillip’s personal memories.

Phillip drew up the plot, and I organized the pieces into a timeline using Logic Pro. Most of the visual effects were driven by an electric guitar. I built this in Max/MSP using the “fiddle~” object (made by Miller Puckette), so that the pitch and intensity of the playing affected the sizes and positions of the generated geometric graphics. I chose the electric guitar as the input device because I wanted to heighten the performance through a physical, visible gesture of manipulation. Every time I plucked the strings, the program drew a new figure: the loudness was reflected in its size, and the pitch in the position of the circle’s center.
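The mapping itself was simple. The sketch below is a hypothetical Python rendering of the idea, not the original Max patch; the screen size, guitar pitch range, and scaling constants are assumptions added for illustration.

```python
# Hypothetical sketch of the fiddle~-style mapping used in the piece:
# detected pitch -> position of the circle's center, loudness -> its size.
# All constants are illustrative assumptions, not values from the original patch.

SCREEN_W, SCREEN_H = 1280, 720        # assumed projection resolution
MIDI_LOW, MIDI_HIGH = 40.0, 88.0      # assumed playable guitar range (MIDI note numbers)
MAX_RADIUS = 200.0                    # assumed maximum figure size in pixels

def pluck_to_circle(midi_pitch: float, amplitude: float):
    """Map one detected pluck (pitch, amplitude in 0..1) to a circle's center and radius."""
    # Normalize the pitch into 0..1 and clamp it to the assumed range.
    t = (midi_pitch - MIDI_LOW) / (MIDI_HIGH - MIDI_LOW)
    t = max(0.0, min(1.0, t))

    # Pitch decides where the circle's center lands on the screen.
    center_x = t * SCREEN_W
    center_y = (1.0 - t) * SCREEN_H   # higher notes drawn nearer the top

    # Loudness decides how large the drawn figure is.
    radius = max(0.0, min(1.0, amplitude)) * MAX_RADIUS
    return (center_x, center_y), radius

if __name__ == "__main__":
    print(pluck_to_circle(45.0, 0.2))   # a quiet low note
    print(pluck_to_circle(80.0, 0.9))   # a loud high note
```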

[Image: image.1.png]

The plug-in parameters were automated in Logic Pro to follow the timeline; they included mute, dynamics, delay depth, and speed. Thus, I did not need to touch a mouse or keyboard once the performance began. Phillip’s narration was on a separate track, which I fed to his headphones to prompt his speech. His live narration went through a different track and was subject to plug-in automation. The recording of the girl’s voice was placed on its own tracks, which sent signals to the rear speakers. We also used soft synthesizers and a mashed-up found-sound track from a commercial as the thematic material associated with “Her” fantasy.

Phillip also used his laptop to create another projection in front of the table where we were performing. The projection imitated a Skype conversation by showing Phillip’s face and “Her” face. The images were controlled by a Max patch that reacted to the intensity of his voice: when he spoke louder, additional face images overlapped and flashed as superimposed, modulated layers.
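As with the guitar mapping, the rule was a straightforward one. Here is a minimal illustrative sketch in Python, assuming a loudness value normalized to 0..1; the threshold and the maximum number of layers are invented for the example.

```python
# Hypothetical sketch of the Skype-projection rule: louder speech stacks more
# superimposed face images and triggers a flash. Both constants are assumptions.

FLASH_THRESHOLD = 0.7   # assumed loudness above which the layers flash
MAX_LAYERS = 5          # assumed maximum number of overlapping face images

def voice_to_layers(loudness: float):
    """Map a loudness value in 0..1 to (number of face layers, flash flag)."""
    loudness = max(0.0, min(1.0, loudness))
    layers = 1 + int(loudness * (MAX_LAYERS - 1))   # at least one face stays visible
    flash = loudness > FLASH_THRESHOLD
    return layers, flash

if __name__ == "__main__":
    for level in (0.1, 0.5, 0.9):
        print(level, voice_to_layers(level))
```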

This twelve-minute piece consisted of six segments, so we composed a separate patch for each scene. Each time a scene changed, a signal was sent from Logic Pro to Max to prompt the shift to the next patch.
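In outline, the switching works like a small dispatcher that listens for a cue and swaps in the patch for the new scene. The sketch below is again an illustrative Python analogue: the text does not specify the cue format, so a plain integer scene index stands in for whatever message actually travelled from Logic Pro to Max.

```python
# Hypothetical sketch of the scene-switching logic: the timeline sends a cue at
# each scene boundary and the receiver activates the patch for that scene.
# The integer-index cue and the patch names are assumptions for illustration.

SCENE_PATCHES = {i: f"patch_scene_{i}" for i in range(1, 7)}  # six composed scenes

class ScenePlayer:
    def __init__(self):
        self.active_patch = None

    def on_cue(self, scene_index: int):
        """Called whenever a scene-change cue arrives from the timeline."""
        patch = SCENE_PATCHES.get(scene_index)
        if patch is None:
            return  # ignore cues outside the six scenes
        self.active_patch = patch
        print(f"scene {scene_index}: switched to {patch}")

if __name__ == "__main__":
    player = ScenePlayer()
    for cue in range(1, 7):
        player.on_cue(cue)
```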

Running the patches nearly exceeded the computer’s CPU capacity, which produced several unstable moments. To avoid that risk, it might have been “safer” to perform the work with pre-recorded images rather than real-time manipulation. Nonetheless, we chose this medium despite the possibility of imperfection. We wanted to differentiate ourselves from the other performances in the concert, which simply used fixed media, and to focus instead on the “liveness” and dynamism of the simultaneous experience.

Akio

March 2012