New Things

Oh… hi! Originally, my opening gag for this post was going to be something along the lines of: ‘Huh. This thing still works…?’ – however, the joke is firmly on me, since the ‘’ domain has just passed its first anniversary and, without an up-to-date card on file to renew my ownership, has expired unceremoniously. So, no; that thing does not still work. (If you go there now you’ll find a charming ad-laden placeholder and an invitation to step in and claim that red-hot goldmine. Feel free, entrepreneurial reader, to wreak havoc with my brand identity.) Though the option’s still open to renew, I realised that this inventory should now be at the centre – rather than the kinetic typography generator to which this blog was first tacked on. Et cello… a new domain!

So, post shake-up, said generator can be found at, the old Hope Park Square back-up is now at, while this guy is at the www subdomain, or just plain. Why do I get the feeling it wasn’t many posts ago that I was explaining the last domain set-up…? Enough meta-posting; content!:

I am employed as a Flash developer now. Sweet! This has exactly two benefits:
1) Career finally on track to Coolsville (definitely in an outer borough already), and
2) People asking what I do, unaware of the platform, just think I’m a developer, but a particularly fancy (and arrogant) one. Score.

I work for the wonderful Thought Den <3 – check them out, yo!:

Thought Den, or: Ed('s) Thong Hut

I’ve been made to feel more than at home by Ben and Dan, awesome director-fellows and Kings of Thought Den. They even made a hugely awkward and dorky (completely unfair and impromptu) film of me screwing around with an iPhone-OSC-Flash thing I put together, partly based on the learnings of my previous post on OSC-to-Flash music visualisation. If you can stomach it, read and watch here.

This job move has also meant a spatial move; 302 miles south (woo!) and 70 to the east (boo!). Let’s see, that puts me in … oh, Bristol! Neat! I couldn’t find my camera while I was packing to move (four hours before my flight…), so forgive the sub-par photography, but the Bristol adventure has so far resembled this:

Not Bristol but Troon, in the bleary small hours, packing unfinished - 'Time I took a picture!'

At the airport, in a PA deadzone, obliviously writing an email and missing the boarding call (but only a little bit)

I think I might live along here one day.

Some, uh- scaffolding. It was very, very blue.

Convinced I'd dreamt this, yet there the picture was, in my phone. My clothes and hair were also filthy, and I found a bloodied half-brick, bundled up in my jacket.

What lovely colour sense the engineers and forsakers of this bridge had!

Ah, lucky cats: quintessential Bristol

An economy religion

I might've known it would be a seagull, captaining the shellfish-restaurant-boat

And if thou gaze long into a spiralmonkeyface, the spiralmonkeyface will gaze also into thee.

A more opulent, cobwebbed religion

This isn't actually a photograph but a screencap from when I was wandering through a VR simulation from the 90s. Shortly after, green teapots whizzed past me to collide with a man made entirely of chrome. The shattered pieces then morphed into the word 'cool!' in Comic Sans. Terrifying.

No, naughty cat! Stop it! This isn't your house. But help yourself to Crazy Landlady's frozen food (somehow kept outside during summer...).

*sigh* Doesn't it just make you want to pedal an old bicycle, go buy some bread maybe?

In other Flash-activity / Flactivity, here is a little …Flupdate: I made a desktop clock (a deskclock, if you will) gadget, deploying through AIR. It runs like this:

Note: displayed time correct only occasionally; click for greater accuracy

It needs polishing up and some customisation options, since it’s hardcoded to suit my desktop (above the cat), so I’ll add a nice menu and post it here with the source and everything – like a real community-oriented developer! ALSO: the mouse-position wobble thing was a late and non-final addition. The new Flash Player 10.1 and AIR 2 runtimes yielded some bizarre corruption, with Sprites seemingly masking each other out in places they shouldn’t, and at irregular intervals. Bug report en route – but for now I’m having to update all TextFields each second, and ensure the clock is redrawn every frame (not the sveltest of overheads), so I decided to do at least something with the spent processor time, useless as it may be.
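For the curious, the per-second refresh amounts to something like the sketch below – in TypeScript rather than ActionScript, and with names of my own invention (the real gadget's internals aren't shown here):

```typescript
// Hypothetical sketch of the per-second redraw workaround: rather than
// updating only the field that changed, every text field's string is
// regenerated each tick to sidestep the masking corruption.

function pad(n: number): string {
  // zero-pad single digits so the clock reads "09:05:03", not "9:5:3"
  return n < 10 ? "0" + n : String(n);
}

// Produces the three strings the clock's hour/minute/second fields show.
function clockStrings(d: Date): [string, string, string] {
  return [pad(d.getHours()), pad(d.getMinutes()), pad(d.getSeconds())];
}
```

In the gadget itself, a once-per-second timer would push these strings into the TextFields while the frame loop forces the redraw.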

Pledge: next update in less than a month.

MIDI-driven Flash: Synaesthesia for Everyone!

Typical music visualisers, as found in desktop media players, have limited means to analyse the music played through them. Audio data from a mixed-down track is a blend of waveforms, normally from many sources. Spectral analysis allows amplitude variation at disparate frequencies to be detected and graphed discretely, which can assist in beat detection and tempo analysis. But due to the noisiness (…) of this data, huge variation in compositional arrangements and the need for realtime processing, the majority of visualisers have limited reactivity and focus on producing partially-randomised, generative animation that would be pretty even if running on white noise.


But that’s all for raw audio. Samplers and virtual instruments (rendering MIDI data) have ever-increasing prominence in music creation and live performance – and greatly outclass standard hardware MIDI synthesizers, with which the typical notion of dorky, low-quality MIDI playback is inextricably tied. Compared to an audio stream, ‘message’-based music protocols describe events – normally pitch and velocity/volume information about notes triggering on and off – and nothing about the timbre of the sound, allowing sonic quality (for instance, the performing instrument) to be altered at any point.

Visualising musical data in this form removes all uncertainty around the timing and nature of note events (along with the overhead of detecting such details), accommodating far tighter correspondence between the audio and visual elements than is possible with conventional visualisers.
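To make the ‘message’ idea concrete, here's a minimal decoder for the note events described above, written in TypeScript for illustration (the field layout comes from the MIDI spec; everything else is my own naming):

```typescript
// MIDI channel-voice note messages: status byte 0x9n = note-on, 0x8n =
// note-off (n = channel), then two data bytes for pitch and velocity.
// A note-on with velocity 0 is conventionally treated as a note-off.

interface NoteEvent {
  on: boolean;      // true for note-on, false for note-off
  channel: number;  // 0-15
  pitch: number;    // 0-127 (60 = middle C)
  velocity: number; // 0-127
}

function decodeNote(status: number, data1: number, data2: number): NoteEvent | null {
  const kind = status & 0xf0;
  if (kind !== 0x80 && kind !== 0x90) return null; // not a note message
  const on = kind === 0x90 && data2 > 0;
  return { on, channel: status & 0x0f, pitch: data1, velocity: data2 };
}
```

That's the entirety of what a visualiser needs per note – no spectral guesswork required.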

The Future

And none of this is new – VJs have been leveraging MIDI precision for years, and the field is still growing – but routing note data into Flash is new to me, and something I’d wanted to do for years, provoked by rhythmic music that carried a strong sense of motion, or was otherwise very visual. (Lookin’ at you guys, MSTRKRFT and Susumu Yokota <3 – among many others.)

There are several possible starting points for running MIDI into Flash, but none of them especially mature (at least on Windows; different options are available for OS X). The primary issue is that the Flash runtime has never interfaced with environment-level MIDI APIs, which is pretty much in keeping with conventional use of the platform. Various workarounds are possible, and having read around a lot I settled on the following:

  • Translate the MIDI data into OSC (a newer protocol and likely eventual successor to MIDI, designed to run through the network transport layer), with the VST plugin, OSCGlue, created by Sebastian Oschatz.
  • Receive the OSC packets with the Java flosc server, created by Benjamin Chun, and retransmit them over TCP (OSC data is normally sent over UDP, which Flash does not support).
  • Receive the shiny new TCP-delivered OSC packets with the OSCConnection classes, an XMLConnection extension written by Adam Robertson and revised by Ignacio Delgado.
    (Worth mentioning: the updated flosc server (v2.0.3) given at the Google Code link above didn’t work for me, but I had no problems with the original on Ben Chun’s site (v0.3.1).)
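One detail of OSC worth knowing if you go poking at the packets yourself: per the OSC spec, every address and type-tag string is NUL-terminated and padded out to a four-byte boundary, regardless of whether the packet travels over UDP or flosc's TCP bridge. A quick TypeScript illustration (names are mine):

```typescript
// OSC string encoding: the string, a terminating NUL, then zero-padding
// up to the next multiple of four bytes.

function oscPaddedLength(s: string): number {
  const withNul = s.length + 1; // include the terminating NUL
  return Math.ceil(withNul / 4) * 4;
}

function encodeOscString(s: string): Uint8Array {
  const out = new Uint8Array(oscPaddedLength(s)); // zero-filled padding
  for (let i = 0; i < s.length; i++) out[i] = s.charCodeAt(i);
  return out;
}
```

So an address like "/note" (five characters) occupies eight bytes on the wire.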

Here is a demo of it running, routing percussive hits from Ableton Live:

The visualisation method (best referred to, generously, as a ‘sketch’ for now) comprises an array of ‘particles’, distributed in 3D space, accelerating and rotating in time with note-on messages. On snare hits, the background gradient is also switched out for another. The particles themselves incorporate a few other graphical experiments: all symbols are a single colour, but appear different due to variations in their layer BlendMode; I also created a sliding shutter-mask for the otherwise circular symbols in the Flash Pro IDE before embedding them; and I attempted a depth-of-field blur effect by increasing filter strength with distance from the camera – but it’s all a little abstract for that to be very applicable.
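The reaction logic boils down to something like this (a hedged sketch: the names are mine, and I'm assuming the General MIDI convention of acoustic snare on pitch 38, which the original mapping may or may not use):

```typescript
// Note-ons kick the particle field's shared speed in proportion to
// velocity; a designated snare pitch cycles the background gradient.

const SNARE_PITCH = 38; // assumption: GM standard acoustic snare

interface VisualiserState {
  speed: number;      // shared particle speed, decayed elsewhere each frame
  background: number; // index into the gradient backgrounds
}

function onNoteOn(state: VisualiserState, pitch: number, velocity: number,
                  backgroundCount: number): VisualiserState {
  const speed = state.speed + velocity / 127; // velocity-scaled impulse
  const background = pitch === SNARE_PITCH
    ? (state.background + 1) % backgroundCount
    : state.background;
  return { speed, background };
}
```

Each frame the render loop would then ease `speed` back towards zero, so hits feel like pulses rather than a ratchet.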

You can play with the visualiser ‘offline’, using keypresses: arrow keys (or WASD for the homicidally inclined) to move the particles; spacebar to swap the background. (Note: you’ll probably have to click into the player first.)

Some disclaimers: most of the work outside of Flash was that of the developers cited above; I don’t mean to take credit. This write-up is mostly to aid visitors with the same intentions, riding in on the Google express, since no other documented method I came across suited my needs and/or worked.
And, secondly, the visualiser itself is intended primarily as a tech demo and precursor to a more complete, aesthetic (…synaesthetic?) piece. You’ll see – it’ll be just like having a real mental disorder.

Rain simulation

As a generative animation exercise (and also just because, um, rain is nice-?), I decided to code me up some good ol’ precipitation. Considerations included: degree of realism, adaptability and performance. Specifically, animating a single layer of tear-shaped drops falling uniformly in perfectly straight lines is relatively easy to do, and certainly has its place within the right art style. On the other hand, ultra-realism would entail thousands of varied and varying particles, deeply distributed in 3D space – something that would better suit pre-rendering than realtime animation, or at the very least, something other than Flash. But that would be no fun at all. I opted for a low-ish number of convincing enough, but inaccurately shaped, particles (apparently raindrops are actually round, mushroom-top blobs – who knew?!), with fairly realistic motion and three distinct layers for the parallactic goodness. I tried to keep everything as configurable as possible, so fall speed, wind strength, blustery-ness, particle graphic, drop size variation and viewport size can be adjusted easily. Reuse or lose! – as they say. Well, they ought to start saying that.
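The configurable bits reduce to a per-drop, per-frame step roughly like this (a TypeScript sketch with illustrative parameter names, not the actual source – wind gusts and rotation are omitted for brevity):

```typescript
// Each layer scales motion by its depth for parallax: nearer layers fall
// faster and drift more with the wind; drops recycle at the top once they
// leave the bottom of the viewport.

interface RainConfig {
  fallSpeed: number;     // base vertical pixels per frame
  wind: number;          // horizontal drift per frame
  viewportHeight: number;
}

// Advances one drop by a frame; `depth` is in (0, 1], with 1 = front layer.
function stepDrop(x: number, y: number, depth: number, cfg: RainConfig):
    { x: number; y: number } {
  const nx = x + cfg.wind * depth;
  const ny = y + cfg.fallSpeed * depth;
  return { x: nx, y: ny > cfg.viewportHeight ? 0 : ny };
}
```

Blusteriness, in this framing, is just `wind` varying over time rather than a separate mechanism.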
On with the rain!:

You may notice that in the front and backmost layers, the drops are a little blurred. I wanted to give a narrow depth of field effect, and had initially applied a BlurFilter to their containing sprites, but that hit performance hard – so I drew a separate drop symbol for their layers, softening the perimeter with a radial gradient down to transparent, which is just as cheap performance-wise as a solid drop.

Thinking a little more about the worth of such an effect, I realised that I would likely want the rain to interact with whatever other objects might be in the scene. And so I added an umbrella:

It didn’t make much sense for the umbrella to stop drops across all layers, but the effect wasn’t very noticeable with so many other drops falling past it, so I removed them for this demo. I also added some physics-y hinged swinging based on the umbrella’s sideways movement. A splash animation plays at the exact intersecting point of drop and canopy – but I only later realised that, since around 75 drops make contact every second, I could easily have randomised the locations of the splashes and simplified the collision detection. Still! – honesty is the best policy, and when it’s only drizzling, accuracy of the contact point is more relevant.
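For anyone wanting the collision test itself: the post doesn't spell out the exact geometry, but modelling the canopy as a semicircular dome gives a simple exact-contact-point version (all names here are illustrative):

```typescript
// Returns the y of the canopy surface at a drop's x position, or null if
// the drop falls outside the umbrella's horizontal span. Screen-style
// coordinates: y increases downwards, so the dome surface sits above cy.
function canopyY(cx: number, cy: number, radius: number, dropX: number):
    number | null {
  const dx = dropX - cx;
  if (Math.abs(dx) > radius) return null; // drop misses the umbrella
  return cy - Math.sqrt(radius * radius - dx * dx);
}

// A falling drop splashes once it reaches or passes the canopy surface.
function hitsCanopy(cx: number, cy: number, radius: number,
                    dropX: number, dropY: number): boolean {
  const surface = canopyY(cx, cy, radius, dropX);
  return surface !== null && dropY >= surface;
}
```

The randomised alternative mentioned above would skip `canopyY` entirely and just scatter splashes across the span – cheaper, and at 75 hits a second nobody would notice.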

So now I have almost everything I need for my ‘KEEP THE LOST KITTIES DRY’ minigame. No need to thank me just yet, world.

I drew the visual assets in the Flash CS3 IDE, compiled my classes in the wonderfully lightweight FlashDevelop, and animated the drops using Tweener. That’s all!

Winter observations

The following is from a roll of film I had developed just before Christmas, with pictures shot in November and December, on location in Glasgow and Troon. I used a Ricoh KR-10 camera with 35mm B&W film (ISO 400) and a dreamy Pentax Asahi SMC lens borrowed from my father with a heroic f-number of 1:1.2 / 55mm.

A grey Glasgow Green

Goose Jail: this fella's doing time for honking after 11pm, but he's in with the arm-breakers all the same. I slipped him some cigs; he'll be OK.

The cloud factory, working overtime for Glasgow

Each time I looked up, the dark triangles drew closer and moved down a level. I woke up tasting blood, the last 8 hours unaccounted for, with the ability to recite Pi to a million decimal places.

I like to be upfront with my UFO spoofs when it comes to suspension mechanisms.

A sculpture of rust

Shot? Shut???

I know it's rude to take overt pictures of people walking past in the street, but I genuinely don't think she saw me do it.

I could just detect a continual dull scream emanating from within this bronze sculpture. Neat.

Labradogs sleep with both eyes closed, but with nostrils primed and twitching, ready for action at the drop of a cake.


"And this is some sort of typewriter-television, is it?"

Hey, this is an OK photo. Maybe I'm starting to understand people...?

- No. I selected inanimate objects for focus over actual human beings.

What animal could have made such tracks? A bird?

Uh- a large, wingless, galloping bird...?

A self-duplicating, seemingly aggressive but ultimately friendly group of birds...???

Yep. Bird tracks.


I'm Adam Vernon: front-end developer, free-time photographer, small-hours musician and general-purpose humanoid.