Archive for October, 2012

Visual & performance development – 16/10/12

Yesterday I set up in the dance studio again. This time I didn’t have an audience, or anyone else using the space at the same time, which allowed much more careful calibration and plenty of room for experimentation.

The best discovery from this session is that, with careful calibration, the system will work with stage lights on me, provided they are very low. This allows for a much more visual performance and makes the piece feel less abstract.

This session did highlight an issue with set-up time, however. Getting the system working this well took a good two hours of calibration between me and another person. That time needs to be cut drastically if I want to be able to tour the piece.

Wednesday, October 17th, 2012 · 3rd Year, Live, Stuff I've Done

Live Performance – 12/10/12

A few hours ago I did the first public(ish) performance of this project.

I’d intended to perform at Newport Circus’ skillshare night on Tuesday, and then at the CSM Salon night yesterday, but in both cases logistics proved difficult. At circus, people were paying to be there and we had limited time in the room, so it seemed unfair to make everyone stop what they were doing while I set up with the lights off. At the salon night, there were so many people playing, and so many unknowns about the venue, that it wasn’t worth the hassle; again, it didn’t seem fair to halt everyone’s setting up so I could turn the lights out.

So, at Ben’s suggestion last night, I performed tonight instead, alongside a lineup of him and Kai, in the dance studio at the uni’s city centre campus. This was a far better venue than either of the two I had planned to play at: it offered a much darker room, a nicer floor to work on, and a good ceiling height.

Feedback from the audience suggests that the connection between the movements and sounds is definitely getting clearer. Unfortunately this video doesn’t have good stereo sound, which diminishes the effect. I should have spent more time on setup: the size threshold for the balls was a bit too high, meaning I often had to move forward to stop them cutting out, which greatly reduced the usable height to perform in. You can hear the red ball cutting out at the top of high throws.
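The size-threshold problem is easy to see with a little arithmetic. This is just an illustrative sketch, not my actual patch, and all the numbers are made up: a ball's on-screen blob area falls off roughly with the square of its distance from the camera, so a too-high area threshold caps the usable throw height.

```python
# Sketch (not the actual patch): why a blob-size threshold limits usable
# throw height. Apparent blob area falls off roughly with the square of
# distance from the camera, so a high area threshold cuts tracking off
# well before the top of a high throw. All figures are illustrative.

def apparent_area(area_at_1m: float, distance_m: float) -> float:
    """Approximate on-screen blob area (pixels) at a given distance."""
    return area_at_1m / (distance_m ** 2)

def max_tracking_distance(area_at_1m: float, min_area: float) -> float:
    """Distance beyond which the blob drops below the size threshold."""
    return (area_at_1m / min_area) ** 0.5

area_1m = 400.0  # blob covers ~400 px at 1 m (made-up figure)
print(max_tracking_distance(area_1m, min_area=100.0))  # threshold 100 px -> 2.0 m
print(max_tracking_distance(area_1m, min_area=25.0))   # threshold 25 px  -> 4.0 m
```

Quartering the threshold doubles the usable distance, which is why even a small calibration tweak would have saved those high throws.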

I got to try out a load of the ideas I’ve been developing, though, which was nice. I really need a space to rehearse in; my bedroom certainly doesn’t have the space to practise any of the actual performance elements of this.

Thanks to Kai for filming!

Saturday, October 13th, 2012 · 3rd Year, Live, Stuff I've Done

I suck at blogs – the last few weeks.

First off, I really need to keep posting about dev work. I’ve been doing so much of it over the last few weeks, several hours of development most days, yet posting nothing here. I’ll try to do better from now on, but it means that for a while I’ll be referencing things I’ve done but not mentioned previously. Here’s a quick run-through of what I’ve been up to.

I’ve been trying to figure out ways of tracking that are less dependent on a pitch-black room.

I started off experimenting with using a Wiimote to track the balls via infrared. This would need an IR LED added to each ball alongside the colour one, which would be pretty much impossible with the balls I’m currently using, though I only figured that out a couple of days ago while taking apart my old broken green ball to steal the LED from it. (Red, green, and blue balls are far better for individual tracking than the red, yellow, and blue I was using at the end of last year.) Wiimotes contain hardware IR tracking of up to four points (though they can’t distinguish between points if they go out of view) and are very easy to interface with, since they connect to the Wii over Bluetooth. I have been using DarwiinremoteOSC, which outputs the data in OSC format, and that is super easy to pick up within Max/MSP.
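Since the Wiimote's points are anonymous, keeping track of which point is which has to happen in software. A simple greedy nearest-neighbour match against the previous frame keeps labels stable while points stay in view. This is a hypothetical sketch of the idea, not part of DarwiinremoteOSC or my Max patch:

```python
# Illustrative sketch: the Wiimote's IR camera reports up to four
# anonymous (x, y) points each frame, so point identity must be
# maintained in software. Greedy nearest-neighbour matching against the
# previous frame keeps labels stable while points remain in view.
# Hypothetical code, not part of DarwiinremoteOSC or my patch.

def match_points(prev: dict, current: list) -> dict:
    """Assign each labelled previous point to its nearest current point."""
    assigned = {}
    remaining = list(current)
    for label, (px, py) in prev.items():
        if not remaining:
            break  # a point left the Wiimote's view this frame
        nearest = min(remaining, key=lambda p: (p[0] - px) ** 2 + (p[1] - py) ** 2)
        assigned[label] = nearest
        remaining.remove(nearest)
    return assigned

prev = {"red": (100, 200), "blue": (400, 300)}
frame = [(405, 310), (98, 205)]      # points arrive in arbitrary order
print(match_points(prev, frame))     # {'red': (98, 205), 'blue': (405, 310)}
```

This breaks down exactly where the hardware does: once a point leaves the view and comes back, its label is a guess, which is why I wanted the webcam colour data as a cross-check.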

I got this working nicely, tracking tea-light candles, as they emit a reasonably high amount of light at IR wavelengths. I was surprised by how much faster the Wiimote was at tracking than last year’s patch with my webcam. I assume this is because the tracking is hardware-based, and because the Wiimote possibly operates at a much faster speed (I read somewhere that it runs at 120fps, though I’m not going to fact-check that right now).

I then started writing an algorithm to map the two inputs together, so that the tracking could be done via IR but cross-checked against the webcam colour data whenever a new point is detected, to work out which ball it is. (Getting concerningly close to making a bodged, inefficient, cheapo Kinect-style thing there.) After a lot of maths and a lot of frustration, I simultaneously found out that the algorithm I was probably looking for, to calibrate the two together from given points, was linear regression (thanks to one Matt Dunn and his PhD involving eye tracking), and figured out that my time would almost certainly be better spent improving the visual tracking I was already using. I may come back to the IR approach at some point, but it’s been abandoned for now.
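For the record, the calibration idea boils down to fitting a linear map from Wiimote IR coordinates to webcam coordinates from a handful of shared calibration points. This is a plain least-squares sketch of that idea (one axis shown, invented data), not the patch I was actually writing:

```python
# Sketch of the linear-regression calibration idea: fit a per-axis linear
# map y = a*x + b from Wiimote IR coordinates to webcam coordinates,
# using a few points seen by both cameras. Purely illustrative; the
# calibration data below is made up.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b for paired samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Calibration points: the same ball seen by both cameras (made-up data).
ir_x  = [100, 300, 500, 700]
cam_x = [60, 160, 260, 360]   # same spots in webcam coordinates
a, b = fit_linear(ir_x, cam_x)
print(a, b)          # 0.5 10.0 -> cam_x ≈ 0.5 * ir_x + 10
print(a * 400 + b)   # a new IR reading of 400 maps to webcam x = 210.0
```

With the fit in hand, a newly detected IR point can be projected into webcam space and matched against the nearest coloured blob to identify which ball it is.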

I then entirely rewrote my tracking Max patch to use cv.jit (a library of Max/MSP externals made for various computer vision purposes) rather than the simplistic jit.findbounds method I’d been using previously. The new patch allows a lot more flexibility in colour selection, giving a quicker calibration time in a new venue and coping with much more varied lighting scenarios (though the lighting still needs to be fixed during a performance: changes mess with things).
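To give a rough idea of the flexibility gained: instead of matching one exact colour, the new approach is closer to matching a hue range with a tolerance, which survives moderate brightness changes (a dim red ball is still red). This is a hypothetical Python illustration of that concept, not cv.jit itself:

```python
# Rough illustration of hue-range matching: a pixel matches if its hue is
# within a tolerance of the target hue, regardless of brightness. This is
# conceptually closer to the cv.jit-based patch than exact-colour
# matching, but it is hypothetical code, not cv.jit itself.

import colorsys

def matches_target(rgb, target_hue, tolerance):
    """True if the pixel's hue (0-1) is within `tolerance` of `target_hue`."""
    h, s, v = colorsys.rgb_to_hsv(*[c / 255.0 for c in rgb])
    diff = abs(h - target_hue)
    return min(diff, 1.0 - diff) <= tolerance  # hue wraps around at 1.0

red_hue = 0.0
print(matches_target((200, 30, 30), red_hue, 0.05))  # bright red -> True
print(matches_target((90, 15, 15), red_hue, 0.05))   # dim red    -> True
print(matches_target((30, 30, 200), red_hue, 0.05))  # blue       -> False
```

Brightness still matters at the extremes (a nearly black pixel has an unreliable hue), which is consistent with the patch needing fixed, if not fully dark, lighting.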

During this process I bought a PS3 PlayStation Eye to use for tracking. Initially I thought this would let me track much faster, at 120fps, but due to an error in the driver I’m using it ended up capped at 40fps. That is still significantly better than my MacBook’s inbuilt webcam’s 15fps. The response time of the inbuilt camera was actually surprisingly fine to work with as far as latency during performance goes, but a higher frame rate does cut down on the effects of motion blur, allowing more accurate position readings. I ordered the Eye around the time I developed the interactive percussion demo.

Over the last few weeks I’ve also had some good correspondence with a few professional circus performers about getting them involved in the performance too, including aerial, slack-line, acro, dance, and juggling performers. The responses so far have been pretty promising.

Saturday, October 13th, 2012 · 3rd Year, Live, Stuff I've Done

Motion Tracking and Music – YouTube

Hey, look what I found!

This is obviously along very similar lines to what I have been doing. It implements some interesting sonic responses; I will look into how these are made, and whether I’d like to use some of the sonic concepts in my own work.
For me, though, the sounds are too abstract, and it loses interest a little too quickly.

Monday, October 8th, 2012 · 3rd Year, Inspiration, Live