Emergence by Hays + Ryan Holladay

This is a recent project I worked on for Hays and Ryan Holladay. The goal was to create a portable DMX lighting system based around a collection of standard lamps, each with individually controllable 3-channel RGB LED bulbs, which could be programmed using my DMaX system for Ableton Live. I provided consulting on the specification of the system and made some specialised Max For Live devices to simplify the show programming.

On the practical side, finding a lighting product was the first challenge. We needed something that could easily integrate into the existing lamps without too much modification, that was simple and quick to set up, and that could survive being connected/disconnected on a daily basis (to cope with the rigours of touring).

The perfect product for this was the Glasson DFS3000 system. These high-brightness LED bulbs allow full 4-channel RGBW control in a standard Edison-screw format, meaning they could be fitted directly to the existing lamp holders. DMX and power are supplied as a combined signal, and units can be daisy-chained back to the power supply, which takes a standard DMX input. The massive advantage of this system was that the only modification required to the lamps was to cut off the plug and solder on male and female 3-pin XLR connectors; all the rest of the internal wiring could stay intact.

On the control side, Hays and Ryan are the first users to test LXMax, my new replacement for DMaX, which I'll be posting more information about shortly; it enables a whole new way of working with DMX in Max and Max For Live. For them I created a device which controls the fixtures in groups of 5 (or individually if required) and allows the colour to be specified in terms of hue, saturation and brightness, which is then converted to the equivalent RGB value and output via Art-Net.
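
For the curious, the hue/saturation/brightness to RGB conversion is the standard HSV colour model. The device performs this inside Max, so this minimal C# sketch of the same algorithm is purely illustrative (the names are my own):

    // Illustrative sketch: standard HSV-to-RGB conversion producing the 8-bit
    // values for the three DMX channels of each fixture.
    // hue: 0-360 degrees, saturation and brightness: 0-1
    public static void HsbToRgb(double hue, double saturation, double brightness,
        out byte r, out byte g, out byte b)
    {
        double c = brightness * saturation;           // chroma
        double hPrime = (hue % 360.0) / 60.0;         // hue sector, 0-6
        double x = c * (1 - Math.Abs(hPrime % 2 - 1));
        double m = brightness - c;

        double r1 = 0, g1 = 0, b1 = 0;
        if (hPrime < 1)      { r1 = c; g1 = x; }
        else if (hPrime < 2) { r1 = x; g1 = c; }
        else if (hPrime < 3) { g1 = c; b1 = x; }
        else if (hPrime < 4) { g1 = x; b1 = c; }
        else if (hPrime < 5) { r1 = x; b1 = c; }
        else                 { r1 = c; b1 = x; }

        r = (byte)Math.Round((r1 + m) * 255);
        g = (byte)Math.Round((g1 + m) * 255);
        b = (byte)Math.Round((b1 + m) * 255);
    }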

This was a fun project, and I think the results look very cool and demonstrate well the potential of integrated music and lighting control.

imp.push Beta Released

After a month and a half of exhausting work on various smallish jobs, I arrived home to find a shiny new Ableton Push 2 controller waiting for me.

There's a huge amount of potential in this device, although my thoughts are probably not in line with its manufacturer's intended use! I see it as a high-quality RGB grid controller and a set of re-purposable function buttons, with the massive bonus of a totally customisable built-in LED display which runs at desktop display refresh rates.

In order to start getting some of my plans for integration with various software going, my first order of business was to prove that I could output my own video to the display. This proved pleasantly easy thanks to Ableton's API documentation and libusb, and after a few hours of effort I have a pretty simple Max object which happily streams Jitter matrices to the display at 60 Hz over USB.
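
For anyone curious about the details: Ableton's documentation describes a 960 × 160 pixel display taking 16-bit colour, fed a frame at a time over a USB bulk endpoint, with each frame preceded by a 16-byte header and the pixel data XORed with a fixed 'signal shaping' pattern. The external itself is written in C against libusb, but here's a rough C# sketch of the frame packing as I understand the published spec (treat the constants as illustrative and check them against Ableton's docs):

    // Sketch of the Push 2 display frame format (constants as I read
    // Ableton's published display interface documentation)
    public static class Push2Frame
    {
        public const int Width = 960;      // pixels per line
        public const int Height = 160;     // lines per frame
        public const int LineBytes = 2048; // 1920 bytes of pixel data + 128 bytes of filler

        // Each frame sent over the bulk endpoint is preceded by this 16-byte header
        public static readonly byte[] FrameHeader =
            { 0xFF, 0xCC, 0xAA, 0x88, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 };

        // 'Signal shaping': pixel data is XORed with this repeating 4-byte pattern
        private static readonly byte[] XorPattern = { 0xE7, 0xF3, 0xE7, 0xFF };

        // Packs a full frame of 16-bit pixels into a masked buffer ready to send
        public static byte[] Pack(ushort[] pixels)
        {
            var buffer = new byte[Height * LineBytes];

            for (int y = 0; y < Height; y++)
            {
                for (int x = 0; x < Width; x++)
                {
                    int i = y * LineBytes + x * 2;
                    ushort p = pixels[y * Width + x];
                    buffer[i] = (byte)(p & 0xFF);   // low byte (little-endian)
                    buffer[i + 1] = (byte)(p >> 8); // high byte
                }
            }

            for (int i = 0; i < buffer.Length; i++)
                buffer[i] ^= XorPattern[i % 4];

            return buffer;
        }
    }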

In the interests of sharing, I've released this external on GitHub, and you can also download it from my Max externals page. I'd like to expand the scope of the external in future to include parsing for all the buttons (from their raw MIDI note/controller values) and also full LED control. If you'd like to help out, head over to GitHub and send me a pull request.

PosiStageDotNet v0.1.0 Released

In my day job at Green Hippo I've been working on various automation tracking projects recently, implementing a variety of proprietary UDP automation protocols. However, one of these was PosiStageNet, an open protocol for positional data streaming maintained by VYV.

The protocol allows transport of position, speed, orientation and status/naming information for multiple defined 'trackers' using a nicely-extensible chunk format. After some discussion with VYV, I requested an addition to the specification for some extra tracker information, adding acceleration and target position (this became version 2.01 of the protocol) and allowing for more accurate positional estimation on the client side.
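
To give a flavour of the format: each chunk begins with a 32-bit little-endian header packing a 16-bit chunk ID, a 15-bit data length and a single flag indicating whether the data field contains further sub-chunks. This lets parsers skip chunk types they don't recognise, which is what makes the format so extensible. A rough C# sketch of that header (illustrative names, not the library's actual API):

    // Sketch of the PosiStageNet chunk header as described in the spec;
    // type and member names here are illustrative only
    public struct PsnChunkHeader
    {
        public ushort ChunkId;    // identifies the chunk type
        public int DataLength;    // length of the chunk's data field in bytes
        public bool HasSubChunks; // true if the data field contains child chunks

        public static PsnChunkHeader Parse(uint raw)
        {
            return new PsnChunkHeader
            {
                ChunkId = (ushort)(raw & 0xFFFF),
                DataLength = (int)((raw >> 16) & 0x7FFF),
                HasSubChunks = (raw >> 31) == 1
            };
        }
    }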

On the official site you can download a C++ implementation, and I originally developed a C# wrapper around this. However, wrappers often feel like an ugly solution to me, so I decided to replace it with a full C# implementation. In the name of standardization it seemed sensible to make it both a portable library (creating some interesting opportunities for mobile usage) and open-source. After a few scattered days' work over the last month or so, I've just released the first version of Imp.PosiStageDotNet via NuGet.org.

My hope is that with this library (and the original C++ implementation) the entertainment control industry can start to unify around a single, well-defined and appropriate protocol rather than the back-of-the-envelope ASCII formats which many have been using to date. Of course, as with all standardization attempts this may be futile, but at least I have something to recommend when asked to specify a protocol!


Quantized Crossfader Device

A quick one. I've added my quantized crossfader device to the Max For Live Devices page. This is a small device for automating fades on Live's crossfader at quantized beat intervals. It was created for a show where I was running visuals with the help of Ableton and needed a way to perform fades at musical intervals to match the performance. I'm sure it has applications in more conventional (sound and music) uses of Live as well. Enjoy!

imp.midi - Cross-Platform Unrestricted MIDI in Max For Live

My imp.midi suite of externals is now available for download. These objects are a re-implementation of the standard Max MIDI objects using a cross-platform MIDI library (RtMidi). This means you can use them in Max For Live devices to access the system MIDI ports in an unrestricted way. And… you can get SysEx!

The only caveat is that on Windows you have to disable, in Live's preferences, any MIDI input/output port you want to use: Windows only allows one application to open a port at a time, so if Ableton is using the port, the objects can't connect to it. For this reason, I've added a status outlet to each object to indicate whether the port connection was successful. Other than this extra functionality, the objects are feature-clones of the originals.

MSC For Live Updated to Version 2

I've just updated MSC For Live, a Max For Live device for controlling Ableton Live via MIDI Show Control commands.

A quick explanation of the purpose of this device, and why MSC control is useful. Software like QLab has become the de facto standard for cue-based theatrical sound playback and triggering. However, whilst its features are many, it has no understanding of musical time, structure or tempo. For example, I can create a cue to play a piece of music starting from 10 seconds in, and then another cue to fade it out after 5 seconds, but I can't create a cue to start it at the 6th bar, or to stop it in a musically/rhythmically appropriate place.

The only kind of software that does have these kinds of capabilities is musical software: DAWs (Digital Audio Workstations). So to get some musical intelligence which we can control imperatively (via a cue stack) and link into the wider show control system, we need a way to get control information into that software. Hence MSC For Live, providing a way to get cue-based control data into a software package (Ableton Live) that understands music and can factor in these ideas when responding to commands. It's more akin to cueing a conductor/MD than cueing button presses on a tape player. For example, a conductor wouldn't stop the entire band just because you cued the next scene; they would wait until an appropriate rhythmic point before hitting the cue.

Of course, Ableton's capabilities extend far beyond just audio playback. You could go as far as creating an entire theatrical score dynamically from virtual instruments and cued MIDI clips. You could bring in live audio streams (of real instruments perhaps) and use clip automation to control processing and levels. You can feed click tracks to live performers. By connecting the DAW into a theatrical control system, lots of interesting possibilities emerge.

On to the details of the update. There are no new features in this version, but the internal structure has changed. All MSC parsing is now performed by my new imp.msc.parse Max external (which I will shortly be releasing in standalone form). This improves performance and removes a lot of complicated parsing logic which was previously handled directly in the patch.
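
For reference, MSC commands arrive as universal real-time SysEx messages of the form F0 7F <device ID> 02 <command format> <command> [data] F7, with any cue number encoded as ASCII. Here's a minimal C# sketch of the kind of parsing involved (the external itself is a C object for Max; names here are illustrative):

    // Illustrative MSC parse: extracts device ID, command format, command and
    // cue number from a raw SysEx byte stream
    public static bool TryParseMsc(byte[] sysex,
        out byte deviceId, out byte commandFormat, out byte command, out string cue)
    {
        deviceId = commandFormat = command = 0;
        cue = null;

        // Check SysEx framing and the MSC IDs (0x7F = universal real-time, 0x02 = MSC)
        if (sysex.Length < 7 || sysex[0] != 0xF0 || sysex[1] != 0x7F ||
            sysex[3] != 0x02 || sysex[sysex.Length - 1] != 0xF7)
            return false;

        deviceId = sysex[2];
        commandFormat = sysex[4]; // e.g. 0x01 = lighting, 0x7F = all types
        command = sysex[5];       // e.g. 0x01 = GO, 0x02 = STOP, 0x03 = RESUME

        // Cue data (if present) is ASCII, with fields separated by 0x00
        int dataLength = sysex.Length - 7;
        if (dataLength > 0)
            cue = System.Text.Encoding.ASCII.GetString(sysex, 6, dataLength).Split('\0')[0];

        return true;
    }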

Additionally, the SysEx data input is now handled via my own imp.sysexin external. This is part of my new imp.midi suite of MIDI objects for Max. 'What? Surely Max already has MIDI support?', I hear you say. Well, yes it does, but the MIDI support available when running in Ableton is crippled: you're limited to the selected port and channel for that track. Plus, no SysEx. This is a really, really annoying exclusion, and one that definitely should have been sorted by now, but such is life (and software development). In the previous version of MSC For Live, SysEx input was handled using the lh_midiin objects by Leigh Hunt, but these are only available for OS X, whereas the imp.midi suite is intended to be cross-platform. The current build functions on Windows, however it doesn't work there in the Max runtime hosted by Ableton. I'll be putting up a post (and making the rest of the imp.midi objects available) at a later date, hopefully with the Windows versions complete.

On top of both of those things, I've modernised the look of the device to match the new generation of stylish Max For Live devices. Please go download, give it a test and ping me if you have any problems or queries.

Update (8/3/2016)

I've solved the issues with the Windows build of imp.midi and updated the device to version 2.01. The issue was due to Windows only allowing one application at a time to connect to a MIDI port. It seems that Max does not actually attempt to open its MIDI ports until a midiin or midiout object is created, whereas Ableton opens them all at application load. Fortunately the solution is simply to disable remote/track/sync input on the MIDI input to be used for SysEx. This prevents Live from opening the MIDI port and allows my external to open it. To make this clearer to the user, I've added an error display to the device which indicates when it has failed to open the port and what the likely solution is.


Cross-Platform Drawing with Skia and WPF

Skia is a cross-platform 2D graphics library, most notably used by Google Chrome and Chromium OS. SkiaSharp is a new C# cross-platform wrapper around the library.

Why is this interesting? Well, it's a massive step towards being able to make cross-platform custom GUI controls. I generally prefer to use native OS controls in my application UIs; this is usually not too time-consuming and allows you to tailor the interface to each system. However, I'm soon starting work on a project which requires a large number of totally custom widgets, and the thought of coding those individually for each platform was making my knees weak...

Enter SkiaSharp. Now I have a 2D drawing API which I can access from a portable class library, so I can write all of my custom widgets in cross-platform code. I can then hook them up to a shell class for each platform which exposes properties, hooks up mouse and keyboard input and renders the Skia canvas to the native GUI system.

To test this idea out, I've created the WPF version of this shell object. In this case it's an abstract control (usable from any XAML document) which hosts a Skia canvas inside a WriteableBitmap, allowing composition inside a WPF UI. To implement a Skia-based control, inherit from this class and override the Draw() method. Here's the source:

    /// <summary>
    ///     Abstract class used to create WPF controls which are drawn using Skia
    /// </summary>
    [PublicAPI]
    public abstract class SkiaControl : FrameworkElement
    {
        private WriteableBitmap _bitmap;
        private SKColor _canvasClearColor;

        protected SkiaControl()
        {
            cacheCanvasClearColor();
            createBitmap();
            SizeChanged += (o, args) => createBitmap();
        }

        /// <summary>
        ///     Color used to clear canvas before each call to <see cref="Draw" /> if <see cref="IsClearCanvas" /> is true
        /// </summary>
        [Category("Brush")]
        [Description("Gets or sets a color used to clear canvas before each render if IsClearCanvas is true")]
        public SolidColorBrush CanvasClear
        {
            get { return (SolidColorBrush)GetValue(CanvasClearProperty); }
            set { SetValue(CanvasClearProperty, value); }
        }

        public static readonly DependencyProperty CanvasClearProperty =
            DependencyProperty.Register("CanvasClear", typeof(SolidColorBrush), typeof(SkiaControl),
                new PropertyMetadata(new SolidColorBrush(Colors.Transparent), canvasClearPropertyChanged));

        private static void canvasClearPropertyChanged(DependencyObject o, DependencyPropertyChangedEventArgs args)
        {
            ((SkiaControl)o).cacheCanvasClearColor();
        }

        /// <summary>
        ///     When enabled, canvas will be cleared before each call to <see cref="Draw" /> with the value of
        ///     <see cref="CanvasClear" />
        /// </summary>
        [Category("Appearance")]
        [Description(
            "Gets or sets a bool to determine if canvas should be cleared before each render with the value of CanvasClear")]
        public bool IsClearCanvas
        {
            get { return (bool)GetValue(IsClearCanvasProperty); }
            set { SetValue(IsClearCanvasProperty, value); }
        }

        public static readonly DependencyProperty IsClearCanvasProperty =
            DependencyProperty.Register("IsClearCanvas", typeof(bool), typeof(SkiaControl), new PropertyMetadata(true));

        /// <summary>
        /// Capture the most recent control render to an image
        /// </summary>
        /// <returns>An <see cref="ImageSource"/> containing the captured area</returns>
        [CanBeNull]
        public BitmapSource SnapshotToBitmapSource() => _bitmap?.Clone();

        protected override void OnRender(DrawingContext dc)
        {
            if (_bitmap == null)
                return;

            _bitmap.Lock();

            // Create a Skia surface drawing directly into the bitmap's back buffer.
            // Use PixelWidth/PixelHeight: Width/Height are in device-independent units.
            using (var surface = SKSurface.Create(_bitmap.PixelWidth, _bitmap.PixelHeight,
                SKColorType.N_32, SKAlphaType.Premul, _bitmap.BackBuffer, _bitmap.BackBufferStride))
            {
                if (IsClearCanvas)
                    surface.Canvas.Clear(_canvasClearColor);

                Draw(surface.Canvas, _bitmap.PixelWidth, _bitmap.PixelHeight);
            }

            _bitmap.AddDirtyRect(new Int32Rect(0, 0, _bitmap.PixelWidth, _bitmap.PixelHeight));
            _bitmap.Unlock();

            dc.DrawImage(_bitmap, new Rect(0, 0, ActualWidth, ActualHeight));
        }

        /// <summary>
        ///     Override this method to implement the drawing routine for the control
        /// </summary>
        /// <param name="canvas">The Skia canvas</param>
        /// <param name="width">Canvas width</param>
        /// <param name="height">Canvas height</param>
        protected abstract void Draw(SKCanvas canvas, int width, int height);

        private void createBitmap()
        {
            int width = (int)ActualWidth;
            int height = (int)ActualHeight;

            if (height > 0 && width > 0 && Parent != null)
                _bitmap = new WriteableBitmap(width, height, 96, 96, PixelFormats.Pbgra32, null);
            else
                _bitmap = null;
        }

        private void cacheCanvasClearColor()
        {
            // ToSkia() is a SolidColorBrush -> SKColor conversion extension
            // included in the same library
            _canvasClearColor = CanvasClear.ToSkia();
        }
    }
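
Usage is then just a matter of subclassing and overriding Draw(). For example, a (hypothetical) control which fills itself with a circle, usable from XAML like any other FrameworkElement:

    // Hypothetical example control: draws a filled circle scaled to the control size
    public class CircleControl : SkiaControl
    {
        protected override void Draw(SKCanvas canvas, int width, int height)
        {
            using (var paint = new SKPaint
            {
                Color = SKColors.CornflowerBlue,
                IsAntialias = true,
                Style = SKPaintStyle.Fill
            })
            {
                canvas.DrawCircle(width / 2f, height / 2f, Math.Min(width, height) / 2f, paint);
            }
        }
    }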

Looking for something interesting to demonstrate the concept with, I've written a quick C#/Skia copy of this gif by Dave Whyte, who makes some amazing 2D animations using Processing. Check out skiasharpwpfextensions on GitHub and build the main solution to try it out.

I'm excited to explore further how well this concept will work; my next step will be to build a Cocoa equivalent for use on OS X.