Max/MSP/Jitter

Emergence by Hays + Ryan Holladay

This is a recent project I worked on for Hays and Ryan Holladay. The goal was to create a portable DMX lighting system based around a collection of standard lamps, each with individually controllable 3-channel RGB LED bulbs, which could be programmed using my DMaX system for Ableton Live. I provided consulting on the specification of the system and made some specialised Max For Live devices to simplify the show programming.

On the practical side, finding a lighting product was the first challenge. We needed something that could easily integrate into the existing lamps without too much modification, that was simple and quick to set up, and that could survive being connected/disconnected on a daily basis (to cope with the rigours of touring).

The perfect product for this was the Glasson DFS3000 system. These high-brightness LED bulbs allow full 4-channel RGBW control in a standard Edison-screw format, meaning they could be attached directly to the existing lamp holders. DMX and power are supplied as a combined signal, and units can be daisy-chained back to the power supply, which takes a standard DMX input. The massive advantage of this system was that the only modification required to the lamps was to cut off the plug and solder on male and female 3-pin XLR connectors; all the rest of the internal wiring in the lamps could stay intact.

On the control side, Hays and Ryan are the first users to test LXMax, my new replacement for DMaX, which I will be posting more information about shortly; it enables a whole new way of working with DMX in Max and Max For Live. For them I created a device which controlled the fixtures in groups of five (or individually if required) and allowed the colour to be specified in terms of hue, saturation and brightness; this was then converted to the equivalent RGB values and output via Art-Net.
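
For illustration, here's a minimal C++ sketch of that conversion step: hue, saturation and brightness in, 8-bit RGB values written into a DMX universe for a group of fixtures. The function names and the fixture addressing are mine for the example; the actual device does this inside a Max For Live patch, with LXMax handling the Art-Net output.

```cpp
#include <array>
#include <cmath>
#include <cstdint>

// Convert hue (0-360), saturation (0-1) and brightness (0-1) to 8-bit RGB
// using the standard HSV->RGB formula.
std::array<std::uint8_t, 3> hsbToRgb(double hue, double sat, double bri)
{
    const double c  = bri * sat;                                    // chroma
    const double hp = std::fmod(hue, 360.0) / 60.0;                 // hue sector
    const double x  = c * (1.0 - std::fabs(std::fmod(hp, 2.0) - 1.0));
    const double m  = bri - c;

    double r = 0.0, g = 0.0, b = 0.0;
    if      (hp < 1.0) { r = c; g = x; }
    else if (hp < 2.0) { r = x; g = c; }
    else if (hp < 3.0) { g = c; b = x; }
    else if (hp < 4.0) { g = x; b = c; }
    else if (hp < 5.0) { r = x; b = c; }
    else               { r = c; b = x; }

    auto to8bit = [m](double v) {
        return static_cast<std::uint8_t>(std::lround((v + m) * 255.0));
    };
    return { to8bit(r), to8bit(g), to8bit(b) };
}

// Write the same colour to a group of 3-channel RGB fixtures in a 512-byte
// DMX universe. Start addresses are 1-based, as on the hardware.
void setGroupColour(std::array<std::uint8_t, 512>& universe,
                    const std::array<int, 5>& startAddresses,
                    double hue, double sat, double bri)
{
    const auto rgb = hsbToRgb(hue, sat, bri);
    for (int addr : startAddresses) {
        universe[addr - 1] = rgb[0];   // red channel
        universe[addr]     = rgb[1];   // green channel
        universe[addr + 1] = rgb[2];   // blue channel
    }
}
```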

This was a fun project, and I think the results look very cool and nicely demonstrate the potential of integrated music and lighting control.

imp.push Beta Released

After a month and a half of exhausting work on various smallish jobs, I arrived home to find a shiny new Ableton Push 2 controller waiting for me.

There's a huge amount of potential in this device, although my thoughts are probably not in the area of its manufacturer's intended use! I see it as a high-quality RGB grid controller and a set of re-purposable function buttons, with the massive bonus of a totally customisable built-in LED display which runs at desktop display refresh rates.

In order to start getting some of my plans for integration with various software going, my first order of business was to prove that I could output my own video to the display. This proved pleasingly easy thanks to Ableton's API documents and libusb, and after a few hours of effort I have a pretty simple Max object which happily streams Jitter matrices at 60 Hz over USB.
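
For the curious, here's a rough C++ sketch of the kind of USB transfer involved, based on Ableton's published Push 2 display interface document rather than on the imp.push source: open the device with libusb, then send a 16-byte frame header followed by a full frame buffer to the display's bulk OUT endpoint. Pixel-format packing, the spec's XOR signal-shaping mask and most error handling are left out for brevity.

```cpp
#include <libusb-1.0/libusb.h>
#include <cstdint>
#include <vector>

// Constants taken from Ableton's public Push 2 display interface document.
constexpr std::uint16_t kPushVendorId  = 0x2982;
constexpr std::uint16_t kPushProductId = 0x1967;
constexpr unsigned char kBulkOutEp     = 0x01;
constexpr int kLineBytes  = 2048;              // 960 pixels * 2 bytes, padded to 2048
constexpr int kLines      = 160;
constexpr int kFrameBytes = kLineBytes * kLines;

// A 16-byte header must precede every frame of pixel data.
static unsigned char frameHeader[16] = {
    0xFF, 0xCC, 0xAA, 0x88, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
};

// Send one frame (already packed into the display's 16-bit pixel format and
// XOR-masked as the spec requires) to the display's bulk OUT endpoint.
bool sendFrame(libusb_device_handle* handle, std::vector<unsigned char>& frame)
{
    int transferred = 0;
    if (libusb_bulk_transfer(handle, kBulkOutEp, frameHeader,
                             sizeof(frameHeader), &transferred, 1000) != 0)
        return false;
    return libusb_bulk_transfer(handle, kBulkOutEp, frame.data(),
                                kFrameBytes, &transferred, 1000) == 0;
}

int main()
{
    libusb_init(nullptr);
    libusb_device_handle* handle =
        libusb_open_device_with_vid_pid(nullptr, kPushVendorId, kPushProductId);
    if (!handle)
        return 1;

    libusb_claim_interface(handle, 0);   // display interface; see the spec for details

    // A blank frame; the real object packs each Jitter matrix into this buffer
    // and sends a frame roughly every 16 ms to reach 60 fps.
    std::vector<unsigned char> blank(kFrameBytes, 0);
    sendFrame(handle, blank);

    libusb_release_interface(handle, 0);
    libusb_close(handle);
    libusb_exit(nullptr);
    return 0;
}
```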

In the interests of sharing, I've released this external on GitHub, and you can also download it from my Max externals page. I'd like to expand the scope of the external in future to include parsing for all the buttons (from their raw MIDI note/controller values) and also full LED control. If you'd like to help out, head over to GitHub and send me a pull request.

Quantized Crossfader Device

A quick one. Have added my quantized crossfader device to the Max For Live Devices page. This is a small device for automating fades on Live's crossfader at quantized beat intervals. It was created for a show where I was running visuals with the help of Ableton and needed a way to do fades at musical intervals to match the performance. I'm sure it has applications in more conventional (sound and music) uses of Live as well. Enjoy!

imp.midi - Cross-Platform Unrestricted MIDI in Max For Live

My imp.midi suite of externals is now available for download. These objects are a re-implementation of the standard Max MIDI objects using a cross-platform MIDI library (RtMidi). This means that you can use them in Max For Live devices to access the system MIDI ports in an unrestricted way. And… you can get SysEx!
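
To give a flavour of what RtMidi provides (this is a generic RtMidi sketch, not the imp.midi source), here's a minimal C++ program that opens a system MIDI input directly and receives SysEx, which is exactly what the standard Max For Live MIDI path can't do:

```cpp
#include "RtMidi.h"
#include <cstdio>
#include <vector>

// Called by RtMidi for every incoming message, including SysEx.
static void onMidiIn(double /*deltaTime*/, std::vector<unsigned char>* message, void*)
{
    if (!message->empty() && message->front() == 0xF0)
        std::printf("SysEx received, %zu bytes\n", message->size());
}

int main()
{
    try {
        RtMidiIn midiIn;
        if (midiIn.getPortCount() == 0)
            return 1;

        midiIn.openPort(0);                    // first available system port
        midiIn.ignoreTypes(false, true, true); // pass SysEx through, drop timing/sensing
        midiIn.setCallback(&onMidiIn);

        std::printf("Listening on %s - press Enter to quit\n",
                    midiIn.getPortName(0).c_str());
        std::getchar();
    }
    catch (RtMidiError& error) {
        // On Windows this is the failure you'll see if Live already holds the port open.
        error.printMessage();
        return 1;
    }
    return 0;
}
```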

The only caveat is that on Windows you have to disable, in Live's MIDI preferences, any MIDI input/output port you want to use with these objects. If Ableton is trying to use the port at the same time, the objects can't connect to it. For this reason, I've added a status outlet to each object to indicate whether the port connection was successful or not. Other than this extra functionality, the objects are feature-clones of the originals.

MSC For Live Updated to Version 2

I've just updated MSC For Live, a Max For Live device for controlling Ableton Live via MIDI Show Control commands.

A quick explanation of the purpose of this device, and why MSC control is useful. Software like QLab has become the de facto standard for cue-based theatrical sound playback and triggering. However, whilst its features are many, it has no understanding of musical time, structure or tempo. For example, I can create a cue to play a piece of music starting from 10 seconds in, and then another cue to fade it out after 5 seconds, but I can't create a cue to start it at the 6th bar, or to stop it in a musically/rhythmically appropriate place.

The only kind of software that does have this kind of capability is musical software: DAWs (Digital Audio Workstations). So to get some musical intelligence which we can control imperatively (via a cue stack) and link into the wider show control system, we need a way to get control information into that software. Hence MSC For Live, which provides a way to get cue-based control data into a software package (Ableton Live) that understands music and can factor in these ideas when responding to commands. It's more akin to cueing a conductor/MD than cueing button presses on a tape player. For example, a conductor wouldn't stop the entire band just because you cued the next scene; they would wait until an appropriate rhythmic point before hitting the cue.

Of course, Ableton's capabilities extend far beyond just audio playback. You could go as far as creating an entire theatrical score dynamically from virtual instruments and cued MIDI clips. You could bring in live audio streams (of real instruments perhaps) and use clip automation to control processing and levels. You can feed click tracks to live performers. By connecting the DAW into a theatrical control system, lots of interesting possibilities emerge.

On to details of the update. There are no new features in this version, however the internal structure has changed. All MSC parsing is now performed by my new imp.msc.parse Max external (which I will shortly be releasing in a standalone form). This improves performance and removes a lot of complicated parsing logic which was previously handled in the patch directly.
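
For a rough idea of what that parsing involves: an MSC message is a real-time universal SysEx of the form F0 7F <device ID> 02 <command format> <command> <data> F7, with cue numbers sent as ASCII text. The sketch below shows the general shape of a decoder; it's illustrative only, not the imp.msc.parse implementation.

```cpp
#include <cstdint>
#include <optional>
#include <string>
#include <vector>

// A decoded MIDI Show Control command.
struct MscMessage {
    std::uint8_t deviceId;       // 0x00-0x6F, or 0x7F for "all call"
    std::uint8_t commandFormat;  // e.g. 0x01 lighting, 0x10 sound (general)
    std::uint8_t command;        // e.g. 0x01 GO, 0x02 STOP, 0x03 RESUME
    std::string  cueNumber;      // ASCII cue number, if present
};

// Parse an MSC SysEx message: F0 7F <deviceId> 02 <format> <command> <data> F7.
// Returns nothing if the bytes aren't a valid MSC message.
std::optional<MscMessage> parseMsc(const std::vector<std::uint8_t>& sysex)
{
    if (sysex.size() < 7 || sysex.front() != 0xF0 || sysex.back() != 0xF7)
        return std::nullopt;
    if (sysex[1] != 0x7F || sysex[3] != 0x02)   // real-time universal SysEx, MSC sub-ID
        return std::nullopt;

    MscMessage msg;
    msg.deviceId      = sysex[2];
    msg.commandFormat = sysex[4];
    msg.command       = sysex[5];

    // Cue number is ASCII, terminated by 0x00 (cue list follows) or the final F7.
    for (std::size_t i = 6; i < sysex.size() - 1 && sysex[i] != 0x00; ++i)
        msg.cueNumber.push_back(static_cast<char>(sysex[i]));

    return msg;
}
```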

Additionally, the SysEx data input is now handled via my own imp.sysexin external. This is part of my new imp.midi suite of MIDI objects for Max. 'What? Surely Max already has MIDI support?', I hear you say. Well, yes it does, but the MIDI support available when running in Ableton is restricted to the selected port and channel for that track. Plus, no SysEx. This is a really, really annoying exclusion, and one that definitely should have been sorted by now, but such is life (and software development). In the previous version of MSC For Live, SysEx input was handled using the lh_midiin objects by Leigh Hunt; however, these are only available for OS X, whereas the imp.midi suite is intended to be cross-platform. The current build functions on Windows, but it doesn't work there in the Max runtime hosted by Ableton. I'll be putting up a post (and making the rest of the imp.midi objects available) at a later date, hopefully with the Windows versions complete.

On top of both of those things, I've modernised the look of the device to match some of the stylish new-style Max For Live devices. Please go download, give it a test and ping me if you have any problems or queries.

Update (8/3/2016)

I've solved the issues with the Windows build of imp.midi and updated the device to version 2.01. The issue was due to Windows being unable to connect to MIDI ports that are already in use. It seems that Max does not actually attempt to open its MIDI ports until a midiin or midiout object is created, whereas Ableton opens them all at application load. Fortunately, the solution is simply to disable remote/track/sync input for the MIDI input to be used for SysEx. This prevents Live from opening the MIDI port and allows my external to open it. To make this clearer to the user, I've added an error display to the device which indicates if it failed to open the port and what the likely solution is.