All Posts Filed in ‘Projects’


★ Lights On!

So it turns out Screaming Rapture made the cut: it’s going to be a part of Vivid Sydney this year!

Pretty cool. I’m completely out of the loop on the current progress, news or other exciting developments on the artwork, given that I’m in Central America. So whilst I’ll be more of a shadow operator on the artwork this time, I’m super excited and wish my two partners in crime the best in pulling this one off!



★ The scream

The other day I mentioned that I’d put together a small artwork concept for the upcoming Vivid Sydney light festival. Turns out that Screaming Rapture made the first cut, and is on its way to melting your hearts.

We’ll be meeting the organisers next week to assure them — once again — that yes, we do know what we’re doing.

When I know more, dear readers, you will too.


★ Introducing Screaming Rapture

I’d like to introduce to you a new project I’ve had the pleasure of working on, an artwork concept for the 2012 Vivid Sydney lighting show, called Screaming Rapture. It’s a new artwork that Liam Ryan, Frank Maguire and I have cooked up and submitted as part of the light art walk competition for the Vivid festival, which will kick off midway through this year. This concept is still in those lovely early days when so much is still possible: I hit submit late last week and won’t know for sure (yay or nay) until the end of the month. Still, it’s real in our eyes.

The artwork is essentially a light-sound experiment, inspired by our experience of Vivid last year. You may recall the three of us created an artwork called Social Firefly, which was essentially a small community of artificial robots that could talk to each other using light. They each had a small amount of personality, but over time the whole community tended to develop communal or shared behaviour. It was emergent & beautiful to watch. Now, we’d intended the artwork itself to be emergent, and possibly to spark emergent behaviour through direct interaction (you could influence the artwork if you had a bright enough torch or source of light), but what we got was something far beyond our wildest expectations. What we found was that, despite the festival being primarily a festival of light, many people seemed to take it on board that this was an artwork of sound. So what, you might say? People didn’t understand your artwork; maybe it didn’t communicate properly to its audience.

Well, maybe. But I’m not interested in thinking critically about why the thing didn’t work — I’m interested in thinking about the truly unpredictable behaviour that did emerge during the festival. Night after night, wave after wave of people descended on the large fig tree on Circular Quay to shout, wave and scream really, really loud noises at the fireflies. They made a real ruckus, dancing and shouting at what were, effectively, small deaf robots.

I’ll let you think about that for a moment.

Now what was the cause of this bizarre and unpredictable behaviour? Well, I think I know the answer, and I’d better have it right (because we pretty much based our new artwork on this interesting dynamic). The reason I think this happened is this:

    On a dark night, amongst a large crowd of people, amongst many dispersed but bright lighting artworks, it can be rather hard to see the next artwork until you’re right in front of it. Furthermore, at night sound travels exceptionally well, even when absorbed by the large throng you’re moving in. It really travels. People walking around 100m away from your artwork hear people making noise, and naturally assume that it works with sound.

So when you add these two simple effects together, you find that when people can only really see your artwork from up close, but can hear it from 100m away, they arrive with fairly developed ideas about how it works. And no matter how much text you put on illuminated billboards about how the thing is an “interplay of light”, people will still think it’s all about sound. The idea propagates, becomes the new norm, and each new night the phenomenon recurs. It’s beautiful. Here’s a video of this phenomenon in action.

Remember: the artwork doesn’t respond at all to sound. These people are performing both for themselves and for the others around them. I should also mention that was a fairly quiet example of this interaction; other times the sound was a great deal louder than that.

So, how does this relate to Screaming Rapture? It’s quite simple really: the concept we arrived at connects these two sensory impulses (light and sound) in a new feedback loop. The feedback loop is quite tight, in that the light will only shine if sound is present, and vice versa. We see it as a chance to experiment in creating a truly awesome (in the proper sense of the word) whole-body sensory experience, building on the wonderfully effective positive feedback loop we discovered with Social Firefly. In this case, the artwork rewards the viewers’ acoustic presence by bathing them in one glorious show of attention. I think our liner notes describe the experience pretty well:

Attracted by the lights, someone turns and shouts; the radiant luster is once again revealed from the corners of the eye, spreading light rapidly across the face. The pulsating patterns set off a murmur in the crowd; they speak quietly amongst themselves. The fluttering grows, resulting in more shouts and noise, until wave upon wave of light splashes out across the audience. They raise their voices, they call out for more.

The eyelids respond to the sound, opening further, each lid working to reveal the brilliance of the rapture’s gaze. The audience clap, they shout and they cheer until the darkened lids are flung wide open. They scream and shout at the rapture, staring at the resplendent sea of endless white. An ever-moving throng of bodies bathing in mutual attention.
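
Neither the hardware nor the control logic is settled at this stage, but the sound-to-light half of that loop is simple enough to sketch. Here’s a toy Processing example using the Minim audio library that ships with Processing; the scaling and smoothing constants below are arbitrary, invented purely for illustration — the louder the crowd at the microphone, the brighter the light:

    import ddf.minim.*;

    Minim minim;
    AudioInput in;
    float brightness = 0;

    void setup() {
      size(400, 400);
      minim = new Minim(this);
      in = minim.getLineIn(Minim.MONO, 512);  // open the default microphone
    }

    void draw() {
      // level() reports the current RMS amplitude of the input, roughly 0.0-1.0
      float target = constrain(in.mix.level() * 4, 0, 1);
      // ease towards the target so the "light" breathes rather than flickers
      brightness = lerp(brightness, target, 0.1);
      background(brightness * 255);  // the louder the crowd, the brighter the light
    }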

I for one, can’t wait to see this thing in action. You, my dear readers, will be the first to know when it happens.


★ social firefly

A brief disclaimer: the following is an account of a recent design proposal I worked on with a few mates, guys I’ve worked with before and will most definitely work with again. It’s a story of the exciting combination of ideas, communication and a deadline that can pull the most interesting work out of you, and of how this piece came to be. So, without further ado…


Recently Frank Maguire, Liam Ryan and I came together to discuss the brief for the latest Vivid light art festival, slated to run later this year. The theme of the festival, fiat lux, didn’t give too much away regarding conceptual or curatorial direction, so we took that as licence to think as broadly as possible. In part, the desire to work on something small for Vivid was to stretch creative wings which had remained folded since November last year, but for me the idea grew out of a desire to work with these two talented designers on a project of our own direction. We met to talk about ideas and thoughts on where we might take this festival of light, and within a matter of minutes we’d settled on the crux of the piece, the conceptual core which would flow through right to the delivery and communication of the idea.


Our piece, quite simply titled Social Firefly, is an active social network of small electronic fireflies, all vying for the attention of their peers, interacting and communicating primarily through the silent medium of light. Messages spread from one part of the social network to another, only through the incidental connection of neighbouring fireflies, with messaging patterns and behaviour emerging over the colony as a whole. We were inspired by the emergent synchronising patterns of fireflies in real life, and the incredibly fascinating implications of social network theory, taking these two ideas as our conceptual cornerstones, to anchor and frame the design.


It took a few attempts to get the thing off the ground, with Frank being a proud new father and the rest of us suffering a few missed connections, but take flight we did, and we’ve been informed that we’re on the short list for artists to take part in the festival — which of course we find incredibly exciting. To us, the piece has been alive since before we submitted the design proposal: we imagined the movement of the lights and the interaction between socially connected fireflies and those who live more on the fringes; we see the waves of communication propagating over a large tree canopy, observed by curious humans on the ground, from the nearby ferry terminal, and from the high-rise buildings that peer over the social miasma that is Circular Quay.


The design of the fireflies is relatively simple — each is a small Arduino-controlled robot, with servo motors moving a small 4W LED light, and an array of small LDR (light-dependent resistor) sensors to pick up ambient and nearby firefly light. Each firefly will happily blink and move around by itself, either content to be alone with its thoughts, or eagerly reaching out to others nearby trying to make a connection. When the light sensors pick up light coming from (in all likelihood) the other fireflies in the tree canopy neighbourhood, the otherwise lonely fireflies perk right up, moving around to find their new friend and blinking brighter to signal their existence. We’re interested in a number of things in this artwork: the interplay between actors in the crude network we’re establishing, and the interactions these strange creatures will inspire in their friends. Can’t wait!
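
The firefly firmware itself isn’t reproduced here, but the coupling rule is simple enough to simulate on screen. A minimal Processing sketch, with every constant invented for illustration: each firefly ticks along on its own internal clock, flashes when the clock rolls over, and is nudged forward whenever a nearby neighbour flashes — the classic synchronisation rule.

    int N = 24;                     // colony size (arbitrary)
    float[] phase = new float[N];   // each firefly's internal clock, 0..1
    PVector[] pos = new PVector[N];

    void setup() {
      size(600, 400);
      noStroke();
      for (int i = 0; i < N; i++) {
        phase[i] = random(1);
        pos[i] = new PVector(random(width), random(height));
      }
    }

    void draw() {
      background(10);
      boolean[] flashing = new boolean[N];
      for (int i = 0; i < N; i++) {
        phase[i] += 0.005;          // the clock ticks forward
        if (phase[i] >= 1) {
          phase[i] = 0;             // roll over...
          flashing[i] = true;       // ...and fire
        }
      }
      // a flash picked up by a nearby firefly (its "LDR") nudges that
      // firefly's clock forward, pulling neighbours into step over time
      for (int i = 0; i < N; i++) {
        if (!flashing[i]) continue;
        for (int j = 0; j < N; j++) {
          if (i != j && pos[i].dist(pos[j]) < 120) {
            phase[j] = min(phase[j] + 0.03, 1);
          }
        }
      }
      for (int i = 0; i < N; i++) {
        fill(255, 255, 150, flashing[i] ? 255 : phase[i] * 60);
        ellipse(pos[i].x, pos[i].y, 14, 14);
      }
    }

Run it for a minute or two and you can watch the colony drift from chaos into shared waves of blinking — the same communal behaviour we saw emerge in the tree.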



★ Redbull Flugtag

Over the last few months I’ve been working on the largest and scariest construction project I’ve had the chance to work on in over a year: an entry into the Redbull Flugtag competition. The competition ran over the weekend at Mrs Macquarie’s Point; you can see me in the photo above, holding onto the control frame as we go over the edge.

The team and I had a lot of fun working on our design. In a nutshell, it was essentially a hang glider that could be transformed rapidly to/from a large 4-person tent. The tent/glider transformation takes about 5-10 seconds, and we performed this on stage in front of around 20,000 people (rough guess, give or take 10,000 or so), after which we proceeded to fly the craft straight into the harbour!

Needless to say a lot of work went into this project, much more of which is held in our blog, the tent that flew. If you’re interested in bamboo structures that can transform and fly, by all means go have a look. Also worth noting: out of all the designs we saw on the day, ours was the only craft to make it back alive, unscathed and potentially ready and able to fly again another day. Proud? You don’t know the half of it.


★ simple metro

For students in the 11217 Introduction to Construction class: here is an example of a simple metronome counter which moves a line up and down a sketch window over a period of time. You should be able to follow the comments included to see how you might plug your sensors and motors into this sketch, to easily arrange for your motors to move a) automatically left/right over a period of time and b) with increasing intensity based on a reading from your slider/light/proximity sensors.

For the purposes of instruction and demonstration, I’ve connected the line movement to simple mouse input (move the mouse left/right to influence the up/down motion) so you can see how your sensors might influence pre-programmed movement. Give this sketch a try…
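
The sketch itself is no longer embedded in this archive, so what follows is a minimal reconstruction along the lines described above: a counter drives the automatic up/down motion, and mouseX (standing in for your slider/light/proximity sensor) scales how far the line swings.

    // A simple metronome: 'counter' ticks forward every frame, sin() turns it
    // into smooth up/down motion, and mouseX (your sensor) scales the swing.
    float counter = 0;

    void setup() {
      size(400, 400);
      strokeWeight(2);
    }

    void draw() {
      background(255);
      counter += 0.03;  // the metronome tick -- larger steps mean a faster beat
      // map the mouse position (stand-in for a sensor reading) to an amplitude
      float amplitude = map(mouseX, 0, width, 10, height / 2);
      float y = height / 2 + sin(counter) * amplitude;
      line(0, y, width, y);
    }

Swap the map() call for your own sensor reading and the same pattern drives a motor instead of a line.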


★ Phidget Interface Kit + Processing

[Update] In the 4+ years since writing this brief note on combining Phidget hardware and Processing, it’s been one of my most popular posts. I seriously doubt that the linked sketch or comments will still be relevant or even work in 2015. Best look elsewhere for advice if you’re still trying to get your phidgets to sing.

Here’s a simple example for connecting a PhidgetInterfaceKit 8/8/8 to a computer, using Processing.

You will need to download the phidget21.jar package from the phidgets website; you’ll find it in the ‘Programming’ section, among the Java examples. Once downloaded, just use Sketch, Add File… to add it to your sketch. Connect via USB and you’re ready to go. I’ve labelled variables based on the sensors I was using; feel free to reassign and use them as you will.
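
The original sketch is long gone from this page, but the pattern looked roughly like the following, written against the phidget21 Java API of the time (per the update above, don’t expect it to run against current Phidgets libraries):

    // Assumes phidget21.jar has been added via Sketch, Add File...
    import com.phidgets.*;

    InterfaceKitPhidget kit;

    void setup() {
      size(400, 200);
      try {
        kit = new InterfaceKitPhidget();
        kit.openAny();            // attach to the first InterfaceKit found on USB
        kit.waitForAttachment();  // block until the board reports ready
      } catch (PhidgetException e) {
        println(e);
      }
    }

    void draw() {
      background(255);
      try {
        // analog inputs report 0-1000; sensor 0 drives a simple bar here
        int value = kit.getSensorValue(0);
        fill(0);
        rect(0, height - 20, map(value, 0, 1000, 0, width), 20);
      } catch (PhidgetException e) {
        println(e);
      }
    }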


★ Phidget Servo motor output via Processing

[Update] Download the phidget jar archive here. When downloaded, copy+paste the above code into a new sketch, then select Sketch, Add File… to add the phidget21.jar file to your sketch. After that, plug in the phidget servo + a motor and you’re ready to go.

I’m currently preparing for a short piece of Processing tutoring in the Bachelor of Architecture course, commencing tomorrow. The course is a 1st-year construction subject in which the year group is designing modular structures from found objects (read: whatever they can find in large amounts at Reverse Garbage), and 12-15 unsuspecting students are going to be shown the slippery slope that is Processing, all in the aim of augmenting small construction projects with responsive elements.

I’ve used the standard Phidget servo motors with max/msp before, but I’ve decided to switch focus to Processing, so such solutions will no longer suffice. There are some benefits to not using max/msp in the university context:

  • Runtime – I can’t count the number of times we’ve built test prototypes and final projects only to see them fall over due to the small number of licences we have access to on campus. The runtime solution is acceptable for last-minute, last-chance, last-straw moments, but it’s just not good enough for day-to-day use or experimentation.
  • The extraordinary cost of licensing – the licences bought by the university come at a ridiculous cost, not to mention the involved installation process. Processing, by comparison, is such a simple install, for both the core components and the additional libraries.
  • Extensibility – max/msp additional objects are always a welcome addition to the program; however, the objects themselves tend to be closed off, limited in how much they reveal of their inner workings, and can be fairly slow.

To this end I’ve enjoyed the Processing learning curve; there have been more than enough learning resources available online and in book form, so I’m definitely pushing for its inclusion in the syllabus in the architecture faculty. Anthony Burke has been teaching Processing in the Master of Architecture course this semester (with assistance from the computation whiz-kid Ben Coorey), so along with the Arduino hardware, the transition from proprietary to open-source projects is well under way in the DAB.

So, to proceed with the real agenda of this post: I was searching for simple code to interface with the PhidgetServo motor output units we’ve been using, this time working in Processing. I couldn’t find any decent examples online, so I had to cobble one together myself. Read on for more…
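
The example I ended up with boiled down to something like the sketch below, again written against the old phidget21 Java API. The servo tracks the mouse; swap mouseX for whatever input you actually want to drive the motor.

    // Assumes phidget21.jar has been added to the sketch, as above.
    import com.phidgets.*;

    ServoPhidget servo;

    void setup() {
      size(400, 200);
      try {
        servo = new ServoPhidget();
        servo.openAny();            // attach to the first PhidgetServo on USB
        servo.waitForAttachment();  // block until the board reports ready
      } catch (PhidgetException e) {
        println(e);
      }
    }

    void draw() {
      background(255);
      float angle = map(mouseX, 0, width, 0, 180);  // mouse x -> servo angle
      line(mouseX, 0, mouseX, height);
      try {
        servo.setPosition(0, angle);  // motor 0 tracks the mouse
      } catch (PhidgetException e) {
        println(e);
      }
    }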