Bloggle

My wife and I recently made the move from Glasgow, UK to her home province of Beautiful British Columbia™, Canada. While we’re getting set up over here, we’ve been graciously hosted by my parents-in-law. And as a direct consequence, we have all been playing a veritable butt-ton of board games – foremost of which has been Boggle.



When stumped for a gift…

It’s a fun game, and one variant is to sit around without any timer running and openly shout out words (of 4+ or 5+ letters, to keep it interesting) as you see them. After a while, you get the impression that there cannot possibly be any words left to find – but is that really the case? I wanted to find out…

…And I also wanted to avoid using one of the innumerable free solvers available out there, for the sake of – erm, fun?

So, as a weekend project, I wrote the above Boggle solver, publishing for Flash and, courtesy of AIR, Android and iOS, and uploaded the source code to GitHub: https://github.com/hanenbro/BoggleSolver. Please feel free to do what you like with it; the licence is MIT.



Here it is running on a Nexus 7 tablet and iPhone 4S

For anyone looking just to install the app on mobile (for those crucial on-the-go Boggle scuffles), the APK will install on any device set to allow 3rd-party apps, but the IPA will only work on jailbroken iOS devices, unless you recompile with your own provisioning profile.

The app uses a modified word list from InfoChimps, with words shorter than 3 or longer than 16 letters removed (per the rules and limitations of the game). That still leaves around 100,000 words, so at least in my usage, it tends to come up with a high percentage of never-before-seen lexical wonders. Educational tool or ego-deflating depression fuel? You decide!
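(For the curious, the trimming step amounts to a simple length filter. Here's a minimal JavaScript sketch – the helper name is mine, with the limits mirroring the rules above:)

```javascript
// Minimal sketch of the word-list trimming described above. The helper name
// is hypothetical; the limits mirror the game's rules: a minimum word length
// of 3, and a maximum of 16 (every cell on a 4x4 board).
function trimWordList(words, minLen = 3, maxLen = 16) {
  return words.filter(w => w.length >= minLen && w.length <= maxLen);
}
```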

Colour Extractor Utility

This is a quick post to share a utility I put together to extract all unique RGB hex colours from a given image.

My use case: I’m working on a project where a grid of clickable colour swatches needs to be generated. Due to limited space, it’s been difficult to settle on the specific palette, knowing that the flexibility of an RGB or HSL colour picker isn’t an option. So I’ve been mocking up a few different arrangements of swatches in Photoshop, and experimenting with their layout.

At the same time, things like available space could change at any point, so between that and the need to adjust colours down the line, I don’t want to lay all the swatches out by hand and be tied to one arrangement. The layout has to be generative, and so I just need a raw list of colours to work with. The initial plan was to use the eye-dropper and pick out each one, pasting the hex values into a text file – but then I noticed I’d have to do that 60+ times for each revision, and thought: “Hello?! This is the 90s! Make the computer do it.”

The utility is very simple, and will just scan from the top-left pixel, traversing right, then down to the next row – comparing each pixel against a running list to identify all the unique colours. Each colour is stored as a hex value (e.g. 0xBADA55) on a new line in the clipboard-able text data. You can then use some basic find-and-replace commands to get the list into a workable format for your needs.
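The utility itself is published for Flash, but the same pass translates almost directly to JavaScript and the canvas element. Here's a sketch operating on the raw RGBA bytes you'd get from getImageData() – the helper name and structure are mine, not the actual utility's code:

```javascript
// Sketch of the unique-colour pass described above, operating on the raw
// RGBA byte array from canvas.getContext("2d").getImageData(...).data.
// The helper name is hypothetical.
function extractUniqueColours(data) {
  const seen = new Set();
  const lines = [];
  for (let i = 0; i < data.length; i += 4) { // 4 bytes per pixel: R, G, B, A
    const hex = ((data[i] << 16) | (data[i + 1] << 8) | data[i + 2])
      .toString(16).padStart(6, "0").toUpperCase();
    if (!seen.has(hex)) {
      seen.add(hex);
      lines.push("0x" + hex);
    }
  }
  return lines.join("\n"); // one colour per line, ready for the clipboard
}
```

Pixels come out in exactly the top-left-to-bottom-right order described, since getImageData() lays its bytes out row by row.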

If your image has gaps between swatches as mine does, and a background colour, this will be picked up as the first (or second) colour in the list – so you’ll likely want to trash that before going further. Another potential gotcha is the presence of anti-aliasing in your image, which could add unwanted in-between colours to the list. If you’re dealing with such a palette image, I would consider scaling the image down using a non-interpolating algorithm – such as the ‘Nearest Neighbour’ mode in Photoshop. With some experimentation you should be left with only your core palette colours, ready for import into the extractor utility.

JS + Canvas + WebAudio …Pong

My loyal readers (Mum and Dad) will note that the majority of the development-related posts here have focussed on AS3 and Flash Player. Usage of both of these technologies has been changing in recent years; for various reasons they are no longer the cool kids. Much has been said about how it happened, and what it means for the web, so I won’t revisit that heavily-trodden-yet-still-shifting ground – but for me it has signalled that, for much of the front-end web development work I have enjoyed and would like to continue with, open web technologies (primarily HTML, JavaScript and CSS3) are replacing Flash as the toolset of choice.

Thus! I present this write-up of a project I tasked myself with earlier in the year: to achieve a number of the things I’m very comfortable with in AS3, but using JS and the canvas element as the primary technologies.

The Game

A simple and familiar premise: using a mouse pointer or finger, move the left paddle to keep the ball block out of the gutter and hold out until the CPU player can’t keep up with your incredible reaction times. The first player to reach 10 points wins the game, to little or no fanfare.

Approach

With so many developers transitioning from Flash/AS3 to HTML5 right now, a number of technologies, methodologies and perhaps some other -ologies have been created to ease the process and help map skills to these new disciplines. Specifically, I’m thinking of cross-compiling languages, some of the Adobe Edge tools, and JS suites/libraries geared towards this. I looked into a good few of these, but was left with the uncomfortable feeling that, to varying degrees, they obfuscated what went on behind the scenes. Higher level solutions have their appeal (and I honestly don’t like working more than I need to), but it inevitably becomes necessary to implement behaviour outside of the scope of these aids, or at least to know how and why they work the way they do. It’s not that I expected to run into such scope issues while making a crappy Pong clone, but as this was a learning exercise, it paid to avoid any easy, abstracted solutions.

And so, with this in mind, I tried to work with as few packaged solutions as possible, and to use raw, naked (nude, unclothed) JavaScript and HTML. Attractive additions included jQuery, TypeScript, the CreateJS suite and Pixi.js, and I look forward to employing some or all of them in subsequent endeavours – but for now it’s good to know that these aren’t necessary to create a small, self-contained demo.

I also wanted this to be very mobile-friendly; perhaps the primary reason for Flash being shunned so severely was its shaky (and actively shaken) mobile presence, in conjunction with the massive surge in web consumption on portable devices. I’m not doing myself any favours if the game runs on only a couple of desktops/browsers. (Although on a related note, today’s front-end web developer has more to worry about from low adoption of modern HTML5-compliant browsers than from the rather impressive Flash Player penetration, but once again the mobile factor makes this less applicable.)

Tools

Almost all the coding for this project was completed in Sublime Text 2, due to its lightweight footprint, syntax highlighting, unobtrusive completion, and – I’m ashamed to say – the sexiest colour scheme I’ve seen in a text editor. I also dabbled with JetBrains’ Webstorm IDE, but found it unnecessarily bloated for this scale of project. Partly, I think I missed the conveniences of FlashDevelop, and resented the trifling, arbitrary differences, which are somehow easier to ignore in a bare-bones text editor. I’ll be giving Webstorm another shake down the line though.

I tested primarily on Win8+Chrome, and ran my mobile tests on an iPhone 4S, an Android Nexus 7 tablet, and a Blackberry Z10. For the iPhone and tablet, I tried out Adobe’s Edge Inspect, but sadly found it to provide little insight or purpose beyond conventional device-browser testing. The remote inspection was the most attractive feature, but it turned out to be only partially supported and unreliable beyond that. I was later able to debug an iDevice issue in OS X by using Apple’s iOS Simulator, bundled with Xcode.

Graphics

For the early iterations, my graphics consisted of calls to canvas.getContext("2d").fillRect() to draw white rectangles on a black background. This is more than adequate for a Pong clone but not suitable for most real-world projects with any kind of art direction (although flat design is à la mode…), so I drew up some assets in Photoshop:

  • A paddle
  • A ‘ball’
  • Digits 0–9
  • A frame for the play-area
  • Some CRT monitor-style scanlines

…And gave all the main assets a retro green monitor glow, laid each one out in place, then exported the lot as a homebrew spritesheet. From there, the fillRect() calls were replaced with calls to canvas.getContext("2d").drawImage(), which allows sampling a specified area from a loaded image asset, transforming it and drawing it into the canvas.
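For anyone who hasn’t used the nine-argument form of drawImage(), here’s a sketch of the idea. The frame size and grid layout are illustrative assumptions, not the game’s actual sheet:

```javascript
// Sketch of spritesheet sampling with drawImage(). The frame dimensions
// and column count are hypothetical, for illustration only.
const FRAME_W = 32, FRAME_H = 32, COLUMNS = 8;

// Map a frame index to its source rectangle within the sheet.
function frameRect(index) {
  return {
    sx: (index % COLUMNS) * FRAME_W,
    sy: Math.floor(index / COLUMNS) * FRAME_H,
    sw: FRAME_W,
    sh: FRAME_H
  };
}

// In the render loop: sample that region from the loaded sheet image
// and draw it into the canvas at (dx, dy).
function drawFrame(ctx, sheet, index, dx, dy) {
  const { sx, sy, sw, sh } = frameRect(index);
  ctx.drawImage(sheet, sx, sy, sw, sh, dx, dy, sw, sh);
}
```

frameRect() is pure, so the spritesheet maths can be checked without a browser; drawFrame() just feeds its result into drawImage().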

Audio

One of the biggest challenges I ran into while building the game was getting the sound effects to play consistently across every device. Audio support is still one of HTML5’s weakest areas when compared directly to Flash. Support and behaviour vary greatly across platforms, with iOS Safari being the most troublesome. In the end, I settled on the following configuration:

  • Audio is initialised and loaded only upon the first user interaction (mouse pointer or touch) event, as mobile browsers disallow playback that isn’t seen to be ordained by the user
  • The Web Audio API is used as the primary playback mechanism, due to its relatively low latency and versatility
  • Older HTML5 audio tags are used where Web Audio is unavailable
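Sketched out, that configuration looks something like the following. The function names, the pointerdown event and the clip URL are illustrative, not the game’s actual code:

```javascript
// Decide which playback mechanism is available. Pure, so it can be
// exercised outside a browser by passing a stand-in for window.
function pickAudioBackend(global) {
  // Prefer Web Audio for its low latency; fall back to <audio> otherwise.
  return (global.AudioContext || global.webkitAudioContext)
    ? "webaudio" : "htmlaudio";
}

// Browser wiring (illustrative): create nothing until the first user
// gesture, since mobile browsers block playback outside of one.
function onFirstInteraction() {
  if (pickAudioBackend(window) === "webaudio") {
    const Ctor = window.AudioContext || window.webkitAudioContext;
    window.gameAudioCtx = new Ctor(); // unlocked: created inside a gesture
    // ...fetch and decode the sound effect buffers here...
  } else {
    window.gameBeep = new Audio("beep.mp3"); // hypothetical fallback clip
  }
  document.removeEventListener("pointerdown", onFirstInteraction);
}
if (typeof document !== "undefined") {
  document.addEventListener("pointerdown", onFirstInteraction);
}
```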

Chrome for desktop was the most accommodating, and played back both audio forms with no issues. On Android to date, only the beta version of Chrome supports Web Audio, but with this enabled performance is great; with the fallback, sound is badly delayed. On iOS Safari, Web Audio works fine once unlocked by a user input event, but audio element tags are hopeless: only one such clip can be loaded into a given page for playback. BlackBerry 10’s browser also played audio well, but I didn’t bother to check which method it was using. I merely shouted, “Music to my ears!”, and tossed the phone back to my wife.

In summary…

I hope that proves to be of some use to fellow weary ex-AS3 devs, blowing into town on the advice of Old Man Google. The game uploaded here has the minified code, but to have a proper dig around, feel free to check out the GitHub version. Any questions or comments will be answered gladly. Thanks!

Update (26/8/2014):

Chrome changed its implementation of the Web Audio API to stop recognising the (now deprecated) ‘noteOn’ and ‘noteOff’ functions, in favour of ‘start’ and ‘stop’. So that has been updated here and on GitHub. I also added a new CRT-scan effect to the design.
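If you need to keep supporting older WebKit builds alongside current Chrome, a small shim covers both names – this helper is illustrative, not the code from the repo:

```javascript
// Illustrative shim for the rename described above: prefer the modern
// start(), and fall back to the deprecated noteOn() where that's all
// the browser's AudioBufferSourceNode exposes.
function playBuffer(ctx, buffer) {
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  const play = source.start || source.noteOn;
  play.call(source, 0); // begin playback immediately
  return source;
}
```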

A sunset time-lapse

I went on an old-school family holiday last week, and took the opportunity to capture some of the lovely Scottish scenery. We stayed in Kirkcolm, in the Rhins of Galloway, and I shot a time-lapse of the sunset from the back deck.

After some experimenting I settled on one shot every 5 seconds, left the camera on manual and set it to overexpose quite a bit initially – getting the most out of the foreground at the expense of the sky, with the hope of greater longevity as the light went.

I shot in raw, so I was relieved to find I could recover a lot of the blown-out detail from the sky. I settled on a group of raw import settings that worked well for the first few images I looked at, then set them up to batch-process in Photoshop. When I went to sequence these in After Effects though, those settings left the latter end of the sunset too dark for too long, so I ended up deriving a new group of settings that brought the most out of the under-exposed end of the sequence. Finally I lined the two sequences up and faded the brighter version in gradually, as though the exposure had been adjusted manually during the capture.

The flickering specks you can see are (I’m told) swifts and some other small birds, who were nesting (and swooping maniacally) in and around the house. They are a little distracting but I haven’t yet had the heart to content-aware them out. Please sir, they’re only little.

Hello

I'm Adam Vernon: front-end developer, free-time photographer, small-hours musician and general-purpose humanoid.