Category:Code

JS + Canvas + WebAudio …Pong

My loyal readers (Mum and Dad) will note that the majority of the development-related posts here have focussed on AS3 and Flash Player. The fortunes of both technologies have shifted in recent years; for various reasons, they are no longer the cool kids. Much has been said about how that happened and what it means for the web, so I won’t revisit that heavily-trodden-yet-still-shifting ground. For me, though, it has signalled that – for much of the front-end web development work I have enjoyed and would like to continue with – open web technologies (primarily HTML, JavaScript and CSS3) are replacing Flash as the toolset of choice.

Thus! I present this here write-up of a project I set myself earlier in the year: to achieve a number of the things I’m very comfortable with in AS3, but using JS and the canvas element as the primary technologies.

The Game

A simple and familiar premise: using a mouse pointer or finger, move the left paddle to keep the ‘ball’ block out of the gutter, and hold out until the CPU player can’t keep up with your incredible reaction times. The first player to reach 10 points wins the game, to little or no fanfare.

Approach

Due to the number of developers transitioning from Flash/AS3 to HTML5 right now, a number of technologies, methodologies and perhaps some other -ologies have been created to ease the process and help map skills to these new disciplines. Specifically, I’m thinking of cross-compiling languages, some of the Adobe Edge tools, and JS suites/libraries geared towards this. I looked into a good few of these, but was left with the uncomfortable feeling that, to varying degrees, they obfuscated what went on behind the scenes. Higher level solutions have their appeal (and I honestly don’t like working more than I need to), but it inevitably becomes necessary to implement behaviour outside of the scope of these aids, or at least to know how and why they work the way they do. It’s not that I expected to run into such scope issues while making a crappy Pong clone, but as this was a learning exercise, it paid to avoid any easy, abstracted solutions.

And so, with this in mind, I tried to work with as few packaged solutions as possible, and to use raw, naked (nude, unclothed) JavaScript and HTML. Attractive additions included jQuery, TypeScript, the CreateJS suite and Pixi.js, and I look forward to employing some or all of them in subsequent endeavours – but for now it’s good to know that these aren’t necessary to create a small, self-contained demo.

I also wanted this to be very mobile-friendly; perhaps the primary reason for Flash being shunned so severely was its shaky (and actively shaken) mobile presence, in conjunction with the massive surge in web consumption on portable devices. I’m not doing myself any favours if the game runs on only a couple of desktops/browsers. (Although on a related note, today’s front-end web developer has more to worry about from low adoption of modern HTML5-compliant browsers than from the rather impressive Flash Player penetration, but once again the mobile factor makes this less applicable.)

Tools

Almost all the coding for this project was completed in Sublime Text 2, due to its lightweight footprint, syntax highlighting, unobtrusive completion, and – I’m ashamed to say – the sexiest colour scheme I’ve seen in a text editor. I also dabbled with JetBrains’ Webstorm IDE, but found it unnecessarily bloated for this scale of project. Partly, I think I missed the conveniences of FlashDevelop, and resented the trifling, arbitrary differences, which are somehow easier to ignore in a bare bones text editor. I’ll be giving Webstorm another shake down the line though.

I tested primarily on Win8+Chrome, and ran my mobile tests on an iPhone 4S, an Android Nexus 7 tablet, and a Blackberry Z10. For the iPhone and tablet, I tried out Adobe’s Edge Inspect, but sadly found it to provide little insight or purpose beyond conventional device-browser testing. The remote inspection was the most attractive feature, but it turned out to be only partially supported and unreliable beyond that. I was later able to debug an iDevice issue in OS X by using Apple’s iOS Simulator, bundled with Xcode.

Graphics

For the early iterations, my graphics consisted of calls to canvas.getContext("2d").fillRect() to draw white rectangles on a black background. This is more than adequate for a Pong clone but not suitable for most real-world projects with any kind of art direction (although flat design is à la mode…), so I drew up some assets in Photoshop:

  • A paddle
  • A ‘ball’
  • Digits 0–9
  • A frame for the play-area
  • Some CRT monitor-style scanlines

…and gave all the main assets a retro green monitor glow, laid each one out in place, then exported the lot as a homebrew spritesheet. From there, the fillRect() calls were replaced with calls to canvas.getContext("2d").drawImage(), which can sample a specified area of a loaded image asset, transform it and draw it into the canvas.
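
As a rough illustration – the frame names and coordinates below are invented, not the game’s actual sheet layout – sampling a sprite out of a sheet looks something like this:

```javascript
// Hypothetical spritesheet layout: each named frame is a rectangle
// within one loaded image (coordinates here are illustrative only).
var FRAMES = {
	paddle: { x: 0,  y: 0, w: 16, h: 64 },
	ball:   { x: 16, y: 0, w: 16, h: 16 }
};

// Pure helper: look up the source rectangle for a named frame.
function sourceRect(name) {
	var f = FRAMES[name];
	if (!f) { throw new Error("Unknown frame: " + name); }
	return f;
}

// Draw a named frame at (dx, dy). The nine-argument form of
// drawImage samples the source rectangle out of the sheet image.
function drawFrame(ctx, sheet, name, dx, dy) {
	var f = sourceRect(name);
	ctx.drawImage(sheet, f.x, f.y, f.w, f.h, dx, dy, f.w, f.h);
}
```

The nine-argument form of drawImage is what makes the spritesheet approach work: source rectangle in, destination rectangle out, one image request for the whole game.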

Audio

One of the biggest challenges I ran into while building the game was getting the sound effects to play consistently across every device. Audio support is still one of HTML5’s weakest areas when compared directly to Flash. Support and behaviour vary greatly across platforms, with iOS Safari being the most troublesome. In the end, I settled on the following configuration:

  • Audio is initialised and loaded only upon the first user interaction event (mouse pointer or touch), as the mobile browsers disallow playback that isn’t seen to be ordained by the user
  • The Web Audio API is used as the primary playback mechanism, due to its relatively low latency and versatility
  • Older HTML5 audio tags are used where Web Audio is unavailable
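
A minimal sketch of that setup, with invented function names (the real game’s code differs):

```javascript
// Decide which playback mechanism to use, given feature flags.
// Pure, so the choice can be made (and tested) without a browser.
function pickAudioBackend(caps) {
	if (caps.webAudio) { return "webaudio"; }  // preferred: low latency
	if (caps.audioTag) { return "audiotag"; }  // fallback: HTML5 audio tags
	return "none";
}

// Defer all audio setup until the first user gesture, since mobile
// browsers block playback that isn't triggered by user input.
function initAudioOnFirstGesture(doc, onReady) {
	function unlock() {
		doc.removeEventListener("mousedown", unlock);
		doc.removeEventListener("touchstart", unlock);
		var Ctx = typeof AudioContext !== "undefined" ? AudioContext : null;
		onReady(pickAudioBackend({
			webAudio: !!Ctx,
			audioTag: typeof Audio !== "undefined"
		}), Ctx ? new Ctx() : null);
	}
	doc.addEventListener("mousedown", unlock);
	doc.addEventListener("touchstart", unlock);
}
```

Keeping the backend decision in a pure function makes the messy per-platform behaviour easier to reason about; the gesture listener is the only part that needs a real DOM.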

Chrome for desktop was the most accommodating, and played back both audio forms with no issues. On Android to date, only the beta version of Chrome supports Web Audio, but with this enabled performance is great; with the fallback, sound is badly delayed. On iOS Safari, Web Audio works fine once unlocked with a user input event, but audio element tags are hopeless: only one such clip can be loaded into a given page for playback. BlackBerry 10’s browser also played audio well, but I didn’t bother to check which method it was using. I merely shouted, “Music to my ears!”, and tossed the phone back to my wife.

In summary…

I hope that proves to be of some use to fellow weary ex-AS3 devs, blowing into town on the advice of Old Man Google. The game uploaded here has the minified code, but to have a proper dig around, feel free to check out the GitHub version. Any questions or comments will be answered gladly. Thanks!

Update (26/8/2014):

Chrome changed its implementation of the Web Audio API to stop recognising the (now deprecated) ‘noteOn’ and ‘noteOff’ functions, in favour of ‘start’ and ‘stop’. So that has been updated here and on GitHub. I also added a new CRT-scan effect to the design.

Replacing the buttonMode hand cursor icon in FP 10.2+

Thought I’d share a little snippet from a project I worked on a month or so back – it takes advantage of the Mouse.cursor property introduced in Flash Player 10.2. As you may know, Mouse.cursor beats the ass of previous cursor-replacement solutions because it modifies the OS pointer, and thus is not at the whim of slow stage framerates or the various other painful snags that come with faking it: hiding the system cursor and manually positioning a graphic in its place.

There’s a decent amount of documentation out there on how to use it, but my specific requirement was to replace the cursor only when mousing over buttons and other clickable items. I wanted it to show up wherever the hand cursor would normally appear, across a whole project.

This is the kind of behaviour I needed:

Click to run the example. The buttons show a custom mouse-over cursor only when they’re enabled.

Now, if you look at the Adobe docs on the topic, you can register a cursor by passing in a String constant and a MouseCursorData object like so:

Mouse.registerCursor("PointerCursor", mouseCursorData);

Additionally, you can spy a number of static String constants inside the MouseCursor class – such as IBEAM and HAND – which certainly made me wonder whether I could do something like the following:

Mouse.registerCursor(MouseCursor.HAND, mouseCursorData);

But as you may have guessed, this does not work as hoped; it’s not possible just to override the graphic used for the hand cursor.

So here’s my workaround for achieving this kind of behaviour, which I opt to keep inside a centralised class (the filthy singleton). I’ve set up a git repository for the full example, but here’s the crux (a few extra functions aren’t shown here):

public function buttonMode_set(target:Sprite, buttonMode:Boolean):void {
	
	// If the dictionary hasn't been initialised, do so and set up the custom mouse cursor
	if (! _buttonModeDictionary) {
		_buttonModeDictionary = new Dictionary();
		var pointerCursor:MouseCursorData = new MouseCursorData();
		pointerCursor.data = new <BitmapData>[new Cursor32Shadow()];
		pointerCursor.frameRate = 0;
		pointerCursor.hotSpot = new Point(3, 1);
		Mouse.registerCursor("PointerCursor", pointerCursor);
	}
	
	// Check whether the Sprite has already been registered for use
	if (target in _buttonModeDictionary) {
		
		// Only add/remove listeners if the buttonMode state is actually changing
		if (_buttonModeDictionary[target] != buttonMode) {
			_buttonModeDictionary[target] = buttonMode;
			if (buttonMode) {
				target.addEventListener(MouseEvent.ROLL_OVER, pointerCursor_show);
				target.addEventListener(MouseEvent.ROLL_OUT, pointerCursor_hide);
				mouseOver_check(target, true);
			} else {
				target.removeEventListener(MouseEvent.ROLL_OVER, pointerCursor_show);
				target.removeEventListener(MouseEvent.ROLL_OUT, pointerCursor_hide);
				mouseOver_check(target, false);
			}
		}
		
	// Only add the new entry if pseudo-buttonMode is being turned on
	} else if (buttonMode) {
		
		_buttonModeDictionary[target] = true;
		target.addEventListener(MouseEvent.ROLL_OVER, pointerCursor_show);
		target.addEventListener(MouseEvent.ROLL_OUT, pointerCursor_hide);
		mouseOver_check(target, true);
	}
}

And then it can be turned on and off for each object like this:

buttonMode_set(button1, true);	//button1.buttonMode = true;
buttonMode_set(button2, false);	//button2.buttonMode = false;

The full example can be downloaded from GitHub here: https://github.com/hanenbro/CursorDemo

A poor man’s particle system

For a recent Flash job, I was required to add a particle system on top of an application already very CPU heavy. The project involved a large stage size, AR marker detection from a webcam feed, and a 3D-orientated plane displaying a running video, attached to the marker. This video plane then had to spew out various flavours of 3D particle at appropriate moments in the FLV.

The idea of plugging in a sexy particle engine like Flint, when the SWF was struggling even to maintain its target framerate of 25fps, made me uncomfortable. Bespoke seemed the only way to go, and hey! it worked out. Here’s what one of the particle types ended up resembling:

(It’s worth mentioning that fewer particles than can be made in the above demo were needed for the project itself. Maxing out the version here gets my CPU usage up to around 40%, which would not have left enough room for the marker tracking and FLV playback.)

To briefly cover the tricks used here: I faked 3D z-positioning by scaling clips as a function of their bespoke ‘zPosition’ value. The formula focalLength / (focalLength + zPosition) gives the scale factor used to set the scaleX and scaleY properties of each MovieClip. Motion blur was spoofed by overriding the x position property and comparing each new value against the one from the last update: the greater the change, the larger the scaleX multiplier, and the more stretched/blurred the particle appeared.
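
As a sketch in plain JavaScript (the project itself was AS3, and the blur gain here is an invented tuning constant):

```javascript
// Perspective scale: clips shrink as their zPosition grows.
// scale = focalLength / (focalLength + zPosition)
function perspectiveScale(focalLength, zPosition) {
	return focalLength / (focalLength + zPosition);
}

// Fake motion blur: stretch scaleX in proportion to how far x
// moved since the last update (blurFactor is illustrative).
function blurredScaleX(baseScale, x, lastX, blurFactor) {
	return baseScale * (1 + Math.abs(x - lastX) * blurFactor);
}
```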

Rotation of the whole particle field was done by avoiding setting the x or z properties directly, depending instead on ‘radius’ and ‘rotational offset’ values. All the particles reside inside an imagined cylinder (the particle field), with its central spindle aligned with the y-axis. Each particle has its x and z location calculated on the basis of this rotation and distance from the central axis, as they move in their orbits. Channelling Mr McCallum, my secondary school maths teacher, the formulae to do this kind of positioning are, for the x axis: cos(angle) * radius; and for the z axis: sin(angle) * radius. (These can be used instead with the x and y properties to show rotation around a flat circle as opposed to the cylinder.)
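
In JavaScript form (illustrative; the original was AS3), the positioning boils down to:

```javascript
// Place a particle on its orbit: the angle rotates around the
// cylinder's y-axis spindle, radius is the distance from it.
function orbitPosition(angle, radius) {
	return {
		x: Math.cos(angle) * radius,
		z: Math.sin(angle) * radius
	};
}
```

Swap z for y in the caller and the same function describes rotation around a flat circle instead of the cylinder.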

In addition to rotary motion, the particles were required to ‘wander’ of their own accord. To achieve this, a large bitmap of Perlin noise is generated at the beginning of the simulation, with a reference to it passed to each particle instance. Going from pixel to pixel each frame, the RGB value is sampled and used to increment the y-velocity and radius-velocity of each particle. Over time, the Perlin noise is traversed completely, resulting in a wide range of even motion. Each particle begins sampling from a randomised row/column too, so that the movement is staggered.
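
A simplified sketch of the wander step, treating the noise bitmap as a plain 2D array of 0–255 values (the gains and field names are invented; the real code sampled a BitmapData):

```javascript
// Walk the noise bitmap pixel by pixel each frame, nudging the
// particle's velocities. Centring the sample around 128 lets the
// noise push in either direction.
function wanderStep(particle, noise) {
	var row = noise[particle.noiseY % noise.length];
	var v = row[particle.noiseX % row.length];
	particle.velY += (v - 128) / 128 * 0.05;       // illustrative gain
	particle.velRadius += (v - 128) / 128 * 0.02;  // illustrative gain
	particle.noiseX += 1;                          // traverse the bitmap
	if (particle.noiseX % row.length === 0) { particle.noiseY += 1; }
}
```

Giving each particle a randomised starting noiseX/noiseY is what staggers the movement across the field.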

Thanks Ken Perlin!
Setting the Perlin noise generation to tile allows the bitmap to be traversed from side to side and top to bottom without any sudden changes in value.

With all said and done, there may indeed have been a package out there that was equally lightweight and flexible enough for the nuances of the project. But, hard deadlines being as they are, it sometimes pays just to go from the ground up, omitting any chaff on the way – rather than hunt for elusive wheat through sprawling, multifaceted libraries.

If the source code happens to be of use to anyone I’d be very happy to clean it up and post it here. Until then… indolence wins out. Sorry!

Dinoglyphs

Been feeling at all out of touch lately? Wondered what the next big thing’s going to be? You came to the right blog post, friend, because I’ll tell you what it is for free:

ANAGLYPHS


And what would be the perfect companion to this timely tech? Extinct animals. Namely:

DINOSAURS


Over the past few weeks at work, I got the chance to develop an educational Papervision3D/FLARToolKit application for the BBC’s Learning Development site. I wrote up some nerdy/whiney information on the process here, and the page itself can be found at the picture-link below.

SpinARsaurus Challenge

Moving images containing terrible lizards are all very well; Steven Spielberg gave us those in 1993 – more than twenty years ago! But what that documentary promised would be the simple matter of drilling into amber and injecting mosquito blood into a frog still hasn’t yielded any dino fun parks. What gives?

In order to fill this void (over the next couple of months, before one of the parks is complete and we’re busy petting T-rexes), I’ve made a dinosaur that looks so real you could touch it.

Dinoglyph
It will take a minute or two to load up, so please be patient! (You can’t rush virtual reality.) Moving the mouse around alters the spin speed and direction, and zooms in and out.

For the demo, I loaded in a second skeleton model, iterated over the texture files and ColorTransform-ed the red / green+blue out of them, respectively, then set their BlendModes to DIFFERENCE. After that, I moved the red copy a little left, the blue copy a little right, and applied all other movement relative to those locations.
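
For illustration, the same channel-stripping expressed in JavaScript over RGBA pixel data (the project used AS3’s ColorTransform; this is just the equivalent idea as it might apply to canvas ImageData):

```javascript
// Strip colour channels from an RGBA pixel array: keep only red
// for the left-eye copy, or only green + blue for the right-eye
// copy. Mutates and returns the array.
function stripChannels(pixels, keep) {
	for (var i = 0; i < pixels.length; i += 4) {
		if (keep === "red") {
			pixels[i + 1] = 0;  // zero green
			pixels[i + 2] = 0;  // zero blue
		} else {
			pixels[i] = 0;      // zero red, keep green + blue
		}
	}
	return pixels;
}
```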

The eventual colours work out well to be mutually invisible in the corresponding panes of the paper glasses that were bundled with the copy of Ben 10 Magazine I bought in shame. (FYI, decent journalism, but the word searches were MUCH too difficult for the recommended age bracket.) More sophisticated glasses will probably result in uneven colouring.

FUTURE
And you won’t look as cool.

Hello

I'm Adam Vernon: front-end developer, free-time photographer, small-hours musician and general-purpose humanoid.