Robot Opera – Creative Development
I’m currently working on the music and sound design for a major new robotic performance work by Wade Marynowsky called ‘Robot Opera’. The project is funded by a Creative Australia grant from the Australia Council for the Arts. We’ve just completed a three-week creative development period at The Red Box in Lilyfield, Sydney.
Developed in collaboration with contemporary performance group Branch Nebula, Robot Opera features eight larger-than-life rectangular robot performers in a one-hour work co-presented by Carriageworks and Performance Space at Carriageworks Bay 17 in October 2015.
http://performancespace.com.au/events/robot-opera/
CREATIVE TEAM
Artist: Wade Marynowsky
Music and Sound Design: Julian Knowles
Lighting Design: Mirabelle Wouters
Dramaturgy: Lee Wilson
Electrical Design: Ben Nash
Programmer: Imran Khan
Programmer: Adam Hinshaw
The robots operate on a wireless network according to algorithmic principles and choreographed behaviours, incorporating the ability to respond to audience interventions. The choreographed behaviours are mapped out via overhead cameras that track and control each robot’s position in x and y coordinates. Audience members are able to move onto the stage and engage with the robots in close proximity.
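To illustrate the overhead-tracking idea, here is a minimal sketch of converting a tracked camera blob into stage x/y coordinates. The camera resolution, stage dimensions and function names are assumptions for illustration, not the production system's actual calibration.

```python
# Hypothetical sketch: map an overhead camera's pixel coordinates to
# stage x/y positions via a simple linear scaling. All constants below
# are illustrative assumptions, not production values.

CAM_W, CAM_H = 1920, 1080      # assumed overhead camera resolution (pixels)
STAGE_W, STAGE_H = 12.0, 8.0   # assumed stage dimensions (metres)

def pixel_to_stage(px, py):
    """Scale a tracked blob centroid from pixel space to stage metres."""
    return (px / CAM_W * STAGE_W, py / CAM_H * STAGE_H)

# A robot tracked at the centre of the camera frame sits mid-stage.
x, y = pixel_to_stage(960, 540)  # → (6.0, 4.0)
```

In practice a homography would correct for lens and perspective distortion, but the principle is the same: the camera view becomes the shared coordinate space in which choreography is scripted.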
On a musical level, the work is structured so that the robots form an eight-member ensemble, with each robot capable of producing its own independent sound, local to itself – in effect playing a ‘part’ in the traditional sense within a musical ensemble. The resulting experience is of the robots as moving sound sources/performers, performing the score from distributed locations. The musical parts can either be directed from the composer’s computer or take the form of algorithms, with audience input allowed via the robots’ onboard sensors.
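One way to picture an algorithmic ‘part’ bent by audience input is the sketch below: each robot draws notes from a fixed chord, and a normalised proximity reading makes its part busier. The pitch set, proximity scaling and function names are all assumptions for illustration, not the actual score logic.

```python
import random

# Hypothetical sketch: generate a short pitch sequence ('part') for one
# robot, modulated by audience proximity. Pitches are MIDI note numbers;
# the chord and scaling are illustrative assumptions.

BASE_PITCHES = [48, 52, 55, 59]  # one sustained chord, one register per robot group

def part_for_robot(robot_id, proximity, rng):
    """Return a pitch sequence for one robot.

    proximity is normalised 0.0 (nobody near) to 1.0 (very close);
    closer audience members make the part denser.
    """
    base = BASE_PITCHES[robot_id % len(BASE_PITCHES)]
    length = 2 + int(proximity * 6)                    # 2..8 notes
    return [base + rng.choice([0, 7, 12]) for _ in range(length)]

quiet = part_for_robot(0, 0.0, random.Random(1))   # sparse: 2 notes
busy = part_for_robot(0, 1.0, random.Random(1))    # dense: 8 notes
```

Because each robot renders its own part through its local speaker, the ensemble texture is spatialised by the choreography itself: where a robot moves, its voice moves with it.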
On a technical level, the project has been based on the Max and Arduino environments, with additional programming from Imran Khan and Adam Hinshaw. Equipped with Kinect v2 cameras, the robots respond to humans by translating their proximity and facial expression into responsively programmed music, sound and light. The robots also contain sensors to detect objects and barriers. These data are then sent across a wireless network to the central control computers operated by Wade Marynowsky and myself.
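As a rough picture of that sensor-to-control-computer link, the sketch below serialises one robot's reading into a message suitable for sending over the wireless network (e.g. as a UDP datagram). The field names and OSC-style address are hypothetical, not the production protocol.

```python
import json

# Hypothetical sketch: serialise one robot's sensor reading for
# transmission to the central control computers. The message layout
# (OSC-style address, field names) is an assumption for illustration.

def sensor_message(robot_id, proximity_m, obstacle):
    """Pack a sensor reading as UTF-8 JSON bytes, ready for a UDP send."""
    payload = {
        "address": "/robot/%d/sensors" % robot_id,
        "proximity_m": round(proximity_m, 2),  # nearest-person distance, metres
        "obstacle": obstacle,                  # barrier detected by local sensors
    }
    return json.dumps(payload).encode("utf-8")

msg = sensor_message(3, 1.274, False)
```

On the receiving end, a patch in Max (or a small Python relay) could decode each datagram and route the values to the music, light and movement logic; Max's native `udpreceive` object speaks OSC, which is why an OSC-style address is sketched here.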
Data received from robot sensors (arriving at Julian Knowles’ computer)
Broadly speaking, the work is informed by the fields of creative robotics, mediatized performance and interactive media art, and explores new poetics and aesthetics in performance. On a dramaturgical level, the project team, led by Lee Wilson from Branch Nebula has been exploring the aesthetics of non-human performance. This has involved workshopping various dramaturgical concepts and structures.
Lee Wilson, Mirabelle Wouters and Wade Marynowsky discussing structural possibilities during the Red Box residency
The robots each contain a laptop computer, computer-controllable DMX and LED lighting systems, battery-powered motors, and a local audio playback system.
Electrical design was undertaken by Ben Nash
Formation testing
Movement testing