Control Methods

For the reactive system I intend to create, I want a very visual method of control – with expansive movements involving the whole body if possible, to fully connect the visual and auditory aesthetics of the performance. This relates to what I said before about Ginko’s performance – that when the sounds AND the visuals are interesting, the result is far greater than either would be on its own. I intend to look at gestural control in its various forms.

I’ve looked at several types of control:

Mocap – The uni apparently has a Mocap (motion capture) suit. This would easily be the most interesting option to work with, and would give me the largest number of accurately tracked parameters. It would, however, be very complicated to set up, and it’s not certain that I’d be able to use it.

Accelerometers – Tracking with an accelerometer on each hand could work. I don’t know how accurate they’d be as a practical performance tool, but it would be interesting to experiment with some.

Visual tracking – This is potentially the cheapest and most readily available method, with options ranging from:

EyeToy – which is apparently relatively easy to hack and fairly cheap now.

Kinect – which I’ve already talked about and deemed impractical for this module.

Infrared tracking – which is relatively simple at a base level. The EyeToy is apparently easy to convert to an IR receiver by removing the IR filter from the camera and adding a filter made from the inside of a floppy disk or a piece of film negative, which does the opposite job of letting only IR wavelengths through. IR tracking has the issue of not being able to distinguish between different objects if they cross or get too close, making multiple controls inaccurate.

Visual blob tracking – working the same way IR tracking would, but in the visible spectrum. This means simple coloured objects can be used, so much less specialist equipment is needed – a webcam would work. This method does obviously put big constraints on the light levels in a performance, though.

I think blob tracking, while not hugely accurate or flexible, would be the quickest and easiest system to implement, especially as I used it in a personal project last year – tracking a ball around my room and synthesising sounds based on it. That project was purely tech-based, however; there was never much focus on the performance element or on making it sound good. Blob tracking can follow multiple objects, allowing multiple parameters to be controlled at once, and if some form of foot pedal is used as well, a large array of controls would be available through simple movements.
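As a rough illustration of how simple the core of this could be, here’s a minimal colour blob tracking sketch in Python using OpenCV. The HSV colour range, the webcam index and the idea of mapping the blob position to synth parameters are all assumptions for the example, not anything final – the real version would need tuning to the actual object and lighting, and would pass the values on to whatever is doing the synthesis.

```python
# Minimal colour blob tracking sketch (assumes a recent OpenCV, e.g. 4.x).
import cv2
import numpy as np

# Hypothetical HSV range for a bright green object; would need tuning
# to the real object and the performance lighting.
LOWER = np.array([40, 100, 100])
UPPER = np.array([80, 255, 255])

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Threshold the frame to a binary mask of "object-coloured" pixels.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    mask = cv2.erode(mask, None, iterations=2)   # clean up speckle
    mask = cv2.dilate(mask, None, iterations=2)

    # Treat the largest contour as the tracked blob.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        blob = max(contours, key=cv2.contourArea)
        (x, y), radius = cv2.minEnclosingCircle(blob)
        # x, y (and radius as a rough size/depth cue) could be scaled to
        # synthesis parameters here and sent out, e.g. over OSC or MIDI.
        cv2.circle(frame, (int(x), int(y)), int(radius), (0, 255, 0), 2)

    cv2.imshow("blob tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Tracking more than one object would just mean thresholding with a separate colour range per object, which is where the idea of controlling several parameters at once comes from.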

Thursday, March 29th, 2012 2nd year, Emergent Tech, Live