Show Control

Emergence by Hays + Ryan Holladay

This is a recent project I worked on for Hays and Ryan Holladay. The goal was to create a portable DMX lighting system based around a collection of standard lamps, each with individually controllable 3-channel RGB LED bulbs, which could be programmed using my DMaX system for Ableton Live. I provided consulting on the specification of the system and made some specialised Max For Live devices to simplify the show programming.

On the practical side, finding a lighting product was the first challenge. We needed something that could easily integrate into the existing lamps without too much modification, that was simple and quick to set up, and that could survive being connected/disconnected on a daily basis (to cope with the rigours of touring).

The perfect product for this was the Glasson DFS3000 system. These high-brightness LED bulbs allow full 4-channel RGBW control in a standard Edison-screw format, meaning they could be fitted directly into the existing lamp holders. DMX and power are supplied on a combined cable, and units can be daisy-chained from the power supply, which takes a standard DMX input. The massive advantage of this system was that the only modification required for each lamp was to cut off the plug and solder on male and female 3-pin XLR connectors; the rest of the internal wiring could stay intact.

On the control side, Hays and Ryan are the first users to test my new DMaX replacement software, LXMax, which I will be posting more information about shortly; it enables a whole new way of working with DMX in Max and Max For Live. For them I created a device which controlled the fixtures in groups of five (or individually if required) and allowed the colour to be specified in terms of hue, saturation and brightness, which was then converted to the equivalent RGB values and output via Art-Net.
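As a rough illustration of the conversion step (this is my own sketch, not the actual Max For Live device), the hue/saturation/brightness to RGB mapping can be expressed in a few lines of Python using the standard library's colorsys module, with the result scaled to 8-bit DMX channel levels:

```python
import colorsys

def hsb_to_dmx_rgb(hue, saturation, brightness):
    """Convert HSB values (each 0.0-1.0) to three 8-bit DMX channel levels."""
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, brightness)
    return tuple(round(c * 255) for c in (r, g, b))

# Full-brightness, fully saturated red (hue 0.0)
print(hsb_to_dmx_rgb(0.0, 1.0, 1.0))  # (255, 0, 0)
```

The advantage of programming in HSB rather than raw RGB is that a fade in brightness or saturation touches a single intuitive parameter while the device recalculates all three DMX channels.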

This was a fun project and I think the results look very cool, and demonstrate well the potential of integrated music and lighting control.

PosiStageDotNet v0.1.0 Released

In my day job at Green Hippo I've been working on various automation tracking projects recently, implementing a variety of proprietary automation UDP protocols. However, one of these was PosiStageNet, an open protocol for positional data streaming maintained by VYV.

The protocol allows transport of position, speed, orientation and status/naming information for multiple defined 'trackers' using a nicely-extensible chunk format. After some discussion with VYV I requested an addition to the specification for some extra tracker information, adding acceleration and target position (this is version 2.01 of the protocol), allowing for more accurate positional estimation on the client side.
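To illustrate why the added acceleration field is useful (the function and names here are my own sketch, not part of the PSN specification or library), a client can extrapolate a tracker's position between packets using simple kinematics, with the acceleration term giving a better estimate than position and speed alone:

```python
def extrapolate_position(pos, vel, acc, dt):
    """Estimate one axis of a tracker's position dt seconds after the last
    packet, from its position, velocity and (protocol v2.01) acceleration."""
    return pos + vel * dt + 0.5 * acc * dt * dt

# Tracker at 2.0 m, moving at 1.0 m/s, accelerating at 0.5 m/s^2,
# estimated 0.2 s after the last received packet:
print(extrapolate_position(2.0, 1.0, 0.5, 0.2))  # 2.21
```

In practice the client would apply this per-axis between packet arrivals, so rendering can run at a higher rate than the tracking data is streamed.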

On the official site you can download a C++ implementation, and I originally developed a C# wrapper around this. However, wrappers often feel like an ugly solution to me, so I decided to replace it with a full C# implementation. In the name of standardization it seemed sensible to make it both a portable library (creating some interesting opportunities for mobile usage) and open-source. After a few scattered days' work over the last month or so, I've just released the first version of Imp.PosiStageDotNet via NuGet.org.

My hope is that with this library (and the original C++ implementation) the entertainment control industry can start to unify around a single, well-defined and appropriate protocol rather than the back-of-the-envelope ASCII formats which many have been using to date. Of course, as with all standardization attempts this may be futile, but at least I have something to recommend when asked to specify a protocol!


MSC For Live Updated to Version 2

I've just updated MSC For Live, a Max For Live device for controlling Ableton Live via MIDI Show Control commands.

A quick explanation of the purpose of this device, and why MSC control is useful. Software like QLab has become the de facto standard for cue-based theatrical sound playback and triggering. However, whilst its features are many, it has no understanding of musical time, structure or tempo. For example, I can create a cue to play a piece of music starting from 10 seconds in, and then another cue to fade it out after 5 seconds, but I can't create a cue to start it at the 6th bar, or to stop it in a musically/rhythmically appropriate place.

The only kind of software that does have these capabilities is musical software: DAWs (Digital Audio Workstations). So to get some musical intelligence which we can control imperatively (via a cue stack) and link into the wider show control system, we need a way to get control information into that software. Hence MSC For Live, which provides a way to get cue-based control data into a software package (Ableton Live) that understands music and can factor in these ideas when responding to commands. It's more akin to cueing a conductor/MD than to cueing button presses on a tape player. For example, a conductor wouldn't stop the entire band just because you cued the next scene; they would wait until an appropriate rhythmic point before hitting the cue.

Of course, Ableton's capabilities extend far beyond just audio playback. You could go as far as creating an entire theatrical score dynamically from virtual instruments and cued MIDI clips. You could bring in live audio streams (of real instruments perhaps) and use clip automation to control processing and levels. You can feed click tracks to live performers. By connecting the DAW into a theatrical control system, lots of interesting possibilities emerge.

On to details of the update. There are no new features in this version, however the internal structure has changed. All MSC parsing is now performed by my new imp.msc.parse Max external (which I will shortly be releasing in a standalone form). This improves performance and removes a lot of complicated parsing logic which was previously handled in the patch directly.
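This isn't the imp.msc.parse external itself, but the wire format it has to decode can be sketched briefly. An MSC message is a universal real-time SysEx packet of the form F0 7F &lt;device ID&gt; 02 &lt;command format&gt; &lt;command&gt; [cue data] F7, with cue numbers encoded as ASCII text. A minimal Python parser (my own illustration, with hypothetical function names) might look like this:

```python
def parse_msc(data):
    """Parse a MIDI Show Control SysEx message (a list of byte values).
    Returns (device_id, command_format, command, cue) or raises ValueError."""
    if (len(data) < 7 or data[0] != 0xF0 or data[1] != 0x7F
            or data[3] != 0x02 or data[-1] != 0xF7):
        raise ValueError("not an MSC SysEx message")
    device_id, command_format, command = data[2], data[4], data[5]
    # The cue number (if present) is ASCII; further fields are 0x00-separated
    cue = bytes(data[6:-1]).split(b"\x00")[0].decode("ascii")
    return device_id, command_format, command, cue

# GO (0x01) cue 5.2 for device 0, command format 0x10 (Sound, General)
msg = [0xF0, 0x7F, 0x00, 0x02, 0x10, 0x01] + list(b"5.2") + [0xF7]
print(parse_msc(msg))  # (0, 16, 1, '5.2')
```

Even in this toy form you can see why moving the parsing into a compiled external pays off: validating and slicing raw byte streams is awkward to express as Max patching.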

Additionally, the SysEx data input is now handled via my own imp.sysexin external. This is part of my new imp.midi suite of MIDI objects for Max. 'What? Surely Max already has MIDI support?', I hear you say. Well, yes it does, but the MIDI support available when running in Ableton is crippled to the selected port and channel for that track. Plus, no SysEx. This is a really, really annoying exclusion, and one that definitely should have been sorted by now, but such is life (and software development). In the previous version of MSC For Live, SysEx input was handled using the lh_midiin objects by Leigh Hunt; however, these are only available for OS X, whereas the imp.midi suite is intended to be cross-platform. The current build functions in standalone Max on Windows, but not in the Max runtime hosted by Ableton. I'll be putting up a post (and making the rest of the imp.midi objects available) at a later date, hopefully with the Windows versions complete.

On top of both of those things, I've modernised the look of the device to match some of the stylish new-style Max For Live devices. Please go download, give it a test and ping me if you have any problems or queries.

Update (8/3/2016)

I've solved the issues with the Windows build of imp.midi, and updated the device to version 2.01. The issue was due to Windows being unable to connect to MIDI ports that are already in use. It seems that Max does not actually attempt to open its MIDI ports until a midiin or midiout object is created, whereas Ableton opens them all at application load. Fortunately, the solution is simply to disable remote/track/sync input on the MIDI input to be used for SysEx. This prevents Live from opening the MIDI port and allows my external to open it. To make this clearer to the user, I've added an error display to the device which indicates if it failed to open the port and what the likely solution is.