Typical music visualisers, as found in desktop media players, have limited means to analyse the music played through them. Audio data from a mixed-down track is a blend of waveforms, normally from many sources. Spectral analysis allows amplitude variation at disparate frequencies to be detected and graphed discretely, which can assist in beat detection and tempo analysis. But due to the noisiness (…) of this data, the huge variation in compositional arrangements, and the need for realtime processing, most visualisers have limited reactivity, and focus on producing partially-randomised, generative animation that would look just as pretty running on white noise.
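For illustration, the spectral-analysis idea can be sketched with a naive DFT: measure how much energy a block of samples carries at each frequency of interest. (Python here purely for readability – real visualisers would use an FFT on the audio callback, not this.)

```python
import math

def dft_magnitudes(samples, sample_rate, freqs):
    """Naive DFT: normalised amplitude of each target frequency in a sample block."""
    n = len(samples)
    mags = []
    for f in freqs:
        re = sum(s * math.cos(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        mags.append(2 * math.hypot(re, im) / n)
    return mags

# One second of a pure 100 Hz sine: energy appears at 100 Hz, none at 400 Hz.
rate = 8000
signal = [math.sin(2 * math.pi * 100 * i / rate) for i in range(rate)]
low, high = dft_magnitudes(signal, rate, [100, 400])
print(round(low, 2), round(high, 2))  # prints "1.0 0.0"
```

On a clean sine this is unambiguous; on a real mixed-down track every bin is a noisy blend of sources, which is exactly the problem described above.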
But that’s all for raw audio. Samplers and virtual instruments (rendering MIDI data) have ever-increasing prominence in music creation and live performance – and greatly outclass standard hardware MIDI synthesizers, with which the typical notion of dorky, low-quality MIDI playback is inextricably tied. Compared to an audio stream, ‘message’-based music protocols describe events – normally pitch and velocity/volume information about notes triggering on and off – and nothing about the timbre of the sound, allowing sonic quality (for instance, the performing instrument) to be altered at any point.
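To make that concrete: a MIDI channel voice message is just three bytes – a status byte (message type in the high nibble, channel in the low nibble) followed by pitch and velocity. A minimal decoder, sketched in Python:

```python
def decode_midi_message(data: bytes):
    """Decode a 3-byte MIDI channel voice message (note-on / note-off)."""
    status, pitch, velocity = data[0], data[1], data[2]
    kind = status & 0xF0      # high nibble: message type (0x90 = note-on, 0x80 = note-off)
    channel = status & 0x0F   # low nibble: channel 0-15
    if kind == 0x90 and velocity > 0:
        event = "note-on"
    elif kind == 0x80 or (kind == 0x90 and velocity == 0):
        event = "note-off"    # note-on with velocity 0 is an implicit note-off
    else:
        event = "other"
    return event, channel, pitch, velocity

# 0x90 = note-on on channel 0; pitch 60 = middle C; velocity 100
print(decode_midi_message(bytes([0x90, 60, 100])))  # ('note-on', 0, 60, 100)
```

Note what’s absent: nothing in those three bytes says anything about timbre, which is why the performing instrument can be swapped at any point.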
Visualising musical data in this form removes all uncertainty around the timing and nature of note events (along with the overhead of detecting such details), accommodating far tighter correspondence between the audio and visual elements than is possible with conventional visualisers.
And none of this is new – VJs have been leveraging MIDI precision for years, and the field is still growing – but routing note data into Flash is new to me, and something I’d wanted to do for years, provoked by rhythmic music that carried a strong sense of motion, or was otherwise very visual. (Lookin’ at you guys, MSTRKRFT and Susumu Yokota <3 – among many others.)
There are several possible starting points for running MIDI into Flash, but none especially mature (at least on Windows; different options are available for OS X). The primary issue is that the Flash runtime has never interfaced with environment-level MIDI APIs, which is pretty much in keeping with conventional use of the platform. Various workarounds are possible, and having read around a lot, I settled on the following:
- Translate the MIDI data into OSC (a newer protocol and likely eventual successor to MIDI, designed to run over the network transport layer), using the OSCGlue VST plugin, created by Sebastian Oschatz.
- Receive the OSC packets with the Java flosc server, created by Benjamin Chun, and retransmit them over TCP (OSC data is normally sent over UDP, which Flash does not support).
- Receive the shiny new TCP-delivered OSC packets with the OSCConnection classes, an XMLConnection extension written by Adam Robertson and revised by Ignacio Delgado.
(Worth mentioning: the updated flosc server (v2.0.3) given at the Google Code link above didn’t work for me, but I had no problems with the original on Ben Chun’s site (v0.3.1).)
Here is a demo of it running, routing percussive hits from Ableton Live:
The visualisation method (best described, generously, as a ‘sketch’ for now) comprises an array of ‘particles’, distributed in 3D space, accelerating and rotating in time with note-on messages. On snare hits, the background gradient is also switched out for another. The particles themselves incorporate a few other graphical experiments: all symbols are a single colour, but appear different due to variations in their layer BlendMode; I also created a sliding shutter-mask for the otherwise circular symbols in the Flash Pro IDE before embedding them; and I attempted a depth-of-field blur effect by increasing filter strength with distance from the camera – but it’s all a little abstract for that to be very applicable.
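The particle logic itself is simple enough to sketch (the real thing is ActionScript; this is an illustrative Python rendering of the same idea – every constant here is an assumption, not the demo’s actual tuning):

```python
import random

class Particle:
    """One particle: 3D position and velocity, plus spin, damped each frame."""
    def __init__(self):
        self.pos = [random.uniform(-100, 100) for _ in range(3)]
        self.vel = [0.0, 0.0, 0.0]
        self.spin = 0.0

    def kick(self, velocity):
        """Note-on handler: accelerate and spin in proportion to note velocity (0-127)."""
        strength = velocity / 127.0
        for i in range(3):
            self.vel[i] += random.uniform(-1.0, 1.0) * 10.0 * strength
        self.spin += 0.5 * strength

    def step(self, damping=0.95):
        """Per-frame update: integrate velocity into position, then damp."""
        for i in range(3):
            self.pos[i] += self.vel[i]
            self.vel[i] *= damping
        self.spin *= damping

particles = [Particle() for _ in range(50)]
for p in particles:
    p.kick(100)   # a loud note-on hits every particle
    p.step()
```

The damping is what gives each hit a sharp attack and a gradual settle, so motion stays locked to the note-on timing rather than drifting.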
You can play with the visualiser ‘offline’, using keypresses: arrow keys (or WASD for the homicidally inclined) to move the particles; spacebar to swap the background. (Note: you’ll probably have to click into the player first.)
Some disclaimers: most of the work outside of Flash was that of the developers cited above; I don’t mean to take credit. This write-up is mostly to aid visitors with the same intentions, riding in on the Google express, since no other documented method I came across suited my needs and/or worked.
And, secondly, the visualiser itself is intended primarily as a tech demo and precursor to a more complete, aesthetic (…synaesthetic?) piece. You’ll see – it’ll be just like having a real mental disorder.