Monday, December 10, 2012

Introducing CMUnistrokeGestureRecognizer

How would you go about recognising a gesture like this star shape in an iOS app?
This was the problem posed to me recently while working on a project. I knew I wasn't the first to need to solve this type of problem. For example, the popular Infinity Blade series of iOS games used a shape drawing gesture for spell casting.

Looking further back into the past, you might remember Palm OS Graffiti used a similar gesture recognition technique for text input.
Something that all these gestures have in common is the requirement to recognise shapes drawn from a single path. That is, the path drawn by a user between putting their finger down and lifting their finger up.

Knowing this problem had obviously been solved, I began researching techniques for single path recognition and that's when I found...

$1 Unistroke Recognizer

Created by three clever chaps at the University of Washington back in 2007, the $1 Unistroke Recognizer was designed to recognise single path (unistroke) gestures, exactly what I was looking for. Not only that, but the design goals for the technique make it an ideal candidate for use in mobile applications:

  • Resilience to movement & sampling speed
  • Rotation, scale, position invariance
  • No advanced maths required (e.g., matrix inversions, derivatives, integrals)
  • Easily implemented with few lines of code
  • Fast enough for interactive use
  • Define gestures with a minimum of only one example
  • Return top results ordered by score [0-1]
  • Provide recognition rates competitive with more complex algorithms

A pretty bold set of requirements, but it looks like they were able to achieve them all. The creators published their paper describing their technique and included full pseudocode. Their project website also includes a demo implementation written in JavaScript so you can test it out live in the browser.

$1 Unistroke Recognizer website
$1 Unistroke Recognizer paper (PDF)


CMUnistrokeGestureRecognizer is my port of the $1 Unistroke Recognizer to iOS. I'm not the first to implement this recogniser in Objective-C but none of the existing implementations met my requirements. I wanted the $1 Unistroke Recognizer to be fully contained within a UIGestureRecognizer, with as simple an API as possible.

So CMUnistrokeGestureRecognizer implements the $1 Unistroke Recognizer as a UIGestureRecognizer. It features:

  • Recognition of multiple gestures
  • Standard UIGestureRecognizer callback for success
  • Template paths defined by UIBezierPath objects
  • Optional callbacks for tracking path drawing and recognition failure
  • Configurable minimum recognition score threshold
  • Option to disable rotation normalisation
  • Option to enable the Protractor method for potentially faster recognition

The core recognition algorithm is written in C and is mostly portable across platforms. I say "mostly" because it uses GLKVector functions from the GLKit framework for optimal performance on iOS devices. GLKMath functions take advantage of hardware acceleration such as the ARM NEON SIMD extensions, so I like to use them. It wouldn't take much work to substitute the vector functions if someone wanted to use the core C implementation on another platform.

The CMUnistrokeGestureRecognizer implementation sits on top of the core C library and provides the Objective-C/UIKit interface.

To use it, add the CMUnistrokeGestureRecognizer project to your own as a subproject and add the library to your target. In your source file, import the main header:

#import <CMUnistrokeGestureRecognizer/CMUnistrokeGestureRecognizer.h>

In your code, define one or more paths to be recognised. Create an instance of CMUnistrokeGestureRecognizer, register your paths, then add it to a view.

Your callback method will be called whenever a gesture is successfully matched against the template paths you registered.

Here's an example of the key points:

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Define a path to be recognised
    UIBezierPath *squarePath = [UIBezierPath bezierPath];
    [squarePath moveToPoint:CGPointMake(0.0f, 0.0f)];
    [squarePath addLineToPoint:CGPointMake(10.0f, 0.0f)];
    [squarePath addLineToPoint:CGPointMake(10.0f, 10.0f)];
    [squarePath addLineToPoint:CGPointMake(0.0f, 10.0f)];
    [squarePath closePath];

    // Create the unistroke gesture recogniser and add to view
    CMUnistrokeGestureRecognizer *unistrokeGestureRecognizer = [[CMUnistrokeGestureRecognizer alloc] initWithTarget:self action:@selector(unistrokeGestureRecognizer:)];
    [unistrokeGestureRecognizer registerUnistrokeWithName:@"square" bezierPath:squarePath];
    [self.view addGestureRecognizer:unistrokeGestureRecognizer];
}

- (void)unistrokeGestureRecognizer:(CMUnistrokeGestureRecognizer *)unistrokeGestureRecognizer
{
    // A stroke was recognised
    UIBezierPath *drawnPath = unistrokeGestureRecognizer.strokePath;
    CMUnistrokeGestureResult *result = unistrokeGestureRecognizer.result;
    NSLog(@"Recognised stroke '%@' score=%f bezier path: %@", result.recognizedStrokeName, result.recognizedStrokeScore, drawnPath);
}

See the included demo app for a more detailed example. The demo includes all the template shapes used by the original creators in their own JavaScript demo. The demo app allows you to test the recognition engine, as well as create new template shapes and export shapes out as code for inclusion in your own projects. The demo app is universal for both iPhone and iPad.

CMUnistrokeGestureRecognizer is open source, released under an MIT license. Get it from

I look forward to seeing what developers create with it.

Thursday, October 18, 2012

OpenGL ES with iOS 5 Part 2: Rendering a masterpiece – Swipe Conference 2012

At September's Swipe Conference I gave two talks on OpenGL with iOS. The first talk, "OpenGL ES with iOS 5 Part 1: Learning to draw" was an introduction to OpenGL ES and GLKit. The second talk covered rendering effects in OpenGL using GLKit, looking at the OpenGL debugging and profiling tools that ship with Xcode, and demonstrating how OpenGL can be used for some fancy segue transitions.
In more detail, my talk "OpenGL ES with iOS 5 Part 2: Rendering a masterpiece" covered:

  • Rendering textured triangles using GLKTextureLoader and GLKBaseEffect;
  • Creating cubemaps using GLKTextureLoader;
  • Rendering skyboxes using GLKSkyboxEffect;
  • Rendering reflection map effects using GLKReflectionMapEffect;
  • Demonstration of the Xcode OpenGL ES frame debugger;
  • Demonstration of the OpenGL ES Driver and Analyzer instruments;
  • Demonstration of the OpenGL ES Performance Detective;
  • Performance recommendations specific to OpenGL ES on iOS devices;
  • Demonstration of some fancy custom storyboard segue transitions using OpenGL ES.
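As a taste of the reflection mapping material: a reflection map effect samples a cubemap along the reflection vector R = I − 2(N·I)N, for incident direction I and unit surface normal N. The formula in plain C (a standard graphics identity, not code from the demos):

```c
#include <assert.h>

typedef struct { float x, y, z; } Vec3;

static float dot3(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Reflect incident direction i about unit normal n: R = I - 2(N.I)N */
static Vec3 reflect3(Vec3 i, Vec3 n) {
    float d = 2.0f * dot3(n, i);
    return (Vec3){ i.x - d * n.x, i.y - d * n.y, i.z - d * n.z };
}
```

Looking straight down at a horizontal surface, for example, reflects the view direction straight back up.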

The slides from the talk are available at or [PDF].

The demo apps used in the talk are all released open source.

SwipeOpenGLTriangles demonstrates rendering textured triangles  –

Swipe3D demonstrates GLKSkyboxEffect, GLKReflectionMapEffect, cubemap textures and indexed vertices –

FancySegue shows how to build custom segue transitions using OpenGL –
All the sample apps are universal and support all orientations.

Also see my post about the first talk: OpenGL ES with iOS 5 Part 1: Learning to draw – Swipe Conference 2012.

Update: the presentation video is now available online at

Tuesday, October 2, 2012

OpenGL ES with iOS 5 Part 1: Learning to draw – Swipe Conference 2012

In September I presented two talks at Swipe Conference in Sydney. The first talk, "OpenGL ES with iOS 5 Part 1: Learning to draw", was an introduction to OpenGL ES and GLKit, aimed at iOS developers new to OpenGL programming.
In the talk I used a simple demo app, SwipeOpenGLTriangles, to demonstrate OpenGL ES rendering concepts with GLKit, such as:

  • Setting up an OpenGL ES scene using GLKViewController + GLKView
  • Rendering triangles (GL_TRIANGLES) and meshes made of triangles
  • Applying vertex colours, using GLKBaseEffect
  • Applying lighting, using GLKBaseEffect
  • Applying texturing, using GLKBaseEffect and GLKTextureLoader
  • Using Vertex Array Objects (VAO) and Vertex Buffer Objects (VBO)
  • Using interleaved vertex arrays (IVA)
  • Animating vertex positions (tap screen to animate between flat triangles and 3D open box shape)

The sample app is universal and supports all orientations.
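To illustrate the interleaved vertex array idea from the list above: all attributes for a vertex are packed into one struct, so a single VBO with one stride serves every attribute (a hypothetical layout, not necessarily the demo app's exact vertex format):

```c
#include <assert.h>
#include <stddef.h>

/* One vertex in an interleaved vertex array (IVA): position, normal and
 * texture coordinates packed contiguously, one struct per vertex. */
typedef struct {
    float position[3];
    float normal[3];
    float texCoord[2];
} Vertex;

/* With an IVA, each attribute pointer shares the same stride and uses an
 * offsetof()-computed offset, e.g.:
 *   glVertexAttribPointer(attrib, 3, GL_FLOAT, GL_FALSE,
 *                         sizeof(Vertex), (void *)offsetof(Vertex, normal));
 */
```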

The full source to the demo app is released open source (MIT licensed) at

The slides from the talk are available at or [PDF].

Update: The presentation video is now online at

Friday, May 18, 2012

CMTraerPhysics CocoaHeads Presentation

In March I gave a presentation at Melbourne CocoaHeads about my open source project CMTraerPhysics. CMTraerPhysics is a spring physics engine that I ported to Objective-C/Cocoa, along with some interesting demos for iOS.

Watch "Chris Miles presents CMTraer Physics" on Vimeo (embedded below if your browser supports it).

See the slides at

Thursday, May 10, 2012

Announcing EZForm 1.0 - iOS form handling & validation library

Announcing EZForm 1.0, my open source form handling and validation library for iOS.

The primary goal of EZForm is to simplify form handling in iOS apps, while not enforcing any constraints on the layout and design of the form UI.

EZForm is designed to be decoupled from your user interface layout, leaving you free to present your form UI any way you like. That doesn't mean EZForm won't integrate with your UI. You tell EZForm which of your controls and views you want to handle each form field, and EZForm will take care of input validation, input filtering and updating views when field values change.

EZForm features:
  • Form field types including: text, boolean, radio.
  • Block based validators. User-defined input validation rules can be added to fields as block objects. Some common validators are included with EZForm.
  • Block based input filters. Input filters control what can be entered by the user. For example, an input filter could be added to a text field to allow only numeric characters to be typed. Some common input filters are included with EZForm.
  • Standard input accessory and field navigation. A standard input accessory can be added to text fields by EZForm with one method call. It adds a bar to the keyboard with field navigation and done buttons, similar to Mobile Safari's input accessory. Navigation between fields is handled automatically by EZForm.
  • Automatic view scrolling to keep active text fields visible. With this option enabled, EZForm will adjust a scroll view, table view or arbitrary view to keep the text field being edited on screen and not covered by a keyboard.
  • Invalid field indicators. EZForm can automatically show invalid indicator views on text fields that fail validation. Invalid indicator views can be user-supplied or supplied by EZForm.
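To give a feel for the block-based validator pattern, here it is sketched in plain C with function pointers standing in for Objective-C blocks (the names are mine, not EZForm's API):

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>

/* A validator takes a candidate field value and reports whether it is
 * acceptable; a field can carry any number of validators. */
typedef bool (*Validator)(const char *value);

static bool not_empty(const char *value) {
    return value != NULL && value[0] != '\0';
}

static bool numeric_only(const char *value) {
    if (value == NULL || value[0] == '\0') return false;
    for (const char *p = value; *p; p++)
        if (*p < '0' || *p > '9') return false;
    return true;
}

/* A value is valid only if every registered rule accepts it. */
static bool validate(const char *value, const Validator rules[], int n) {
    for (int i = 0; i < n; i++)
        if (!rules[i](value)) return false;
    return true;
}
```

Input filters work the same way, except they run as the user types and reject individual characters rather than whole values.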

EZForm comes with full API documentation, which can be installed as an Xcode document set, for reading within Xcode's document viewer.

EZForm is convenient to use for both simple and complex forms. The source includes a demo app containing both simple and more complex form examples.

Get EZForm from

Friday, April 27, 2012

CMTraerPhysics – Spring physics engine for iOS

Announcing my port of the Traer v3.0 physics engine to Objective-C/Cocoa: CMTraerPhysics.

Traer Physics is a particle system and spring physics simulator, originally created by Jeffrey Traer Bernstein. It is used to simulate spring and attraction forces between particles, and can be used for some interesting effects.

The CMTraerPhysics source includes a sample iOS app containing a number of physics demos, including a Wonderwall-like demo, a cloth physics demo and a spider web demo.  See videos of some of the demos on YouTube.

The demo app is universal so you can play with the physics demos on both iPhone and iPad. The source for all demos is included with the project. Some demos are rendered with OpenGL ES 2.0, some with Core Animation.

The CMTraerPhysics source is available on github at and is released open source under the MIT license.

Tuesday, March 27, 2012

Paper Baron

Late last year I worked on a cool little iPhone game for the Australian Air Force Defence Jobs: Paper Baron. The development was managed by Millipede Creative Development, and the project was produced & designed by GPYR Melbourne.
Paper Baron is a 2D side scrolling game. In the game, you glide a 3D rendered paper plane through a world of paper constructed obstacles and scenery, aiming to fly for as far as possible.

The game adds a social element in the form of user-created Airstrips. Airstrips are basically geolocation fixed leader boards. Any user can create an Airstrip at their physical location. If they are physically close enough to an Airstrip, users can launch their plane from it to challenge the Airstrip's leader board.

Paper Baron has already received a Gizmodo App Of The Day award.

You can see the game in action by watching the Paper Baron Trailer on YouTube.

Paper Baron is free in the App Store, check it out.

Wednesday, February 1, 2012

My CocoaHeads talk on Augmented Reality with iOS

At the November Melbourne CocoaHeads meeting I gave a talk about Augmented Reality with iOS.
A video of the talk is embedded below (or watch on Vimeo) and the slides are available online.

In the talk I review the current set of available libraries for AR processing on iOS. I run through them relatively quickly, to keep the talk moving, but I do give live demos of each library. The libraries I cover are:

Open Source:

View the slides.

Monday, January 23, 2012

Working With a Famous Blue Hedgehog

Late last year I had the honour of working with one of the most well known computer game characters of all time, the famous blue hedgehog, incorporating him into an interactive augmented reality app for iPhone.
The app is a promotional mini game for Sonic's 20th Anniversary. It is an augmented reality game where users attempt to capture Sonic as he races around, usually too fast for human eyes to see.
We used the String library for augmented reality image recognition handling, a library I highly recommend. String processes the camera input in real time and provides orientation matrices of any of the pre-defined images that are recognised. The app uses this information to render a 3D animated Sonic running through the scene, oriented relative to the marker with 3D perspective. Look at the marker straight on and Sonic runs past in front of you. Look at the marker from a sharp angle and it is possible to see Sonic running in from a distance (or running away into the distance, on the other side).
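To make the matrix part concrete: rendering a model relative to a marker means transforming model-space points by the marker's model-view matrix each frame. Assuming OpenGL's column-major convention (which I'd expect an OpenGL-oriented AR library to follow), applying such a matrix looks like:

```c
#include <assert.h>

typedef struct { float m[16]; } Mat4;             /* column-major, OpenGL style */
typedef struct { float x, y, z, w; } Vec4;

/* Transform a point (w = 1) or direction (w = 0) by a 4x4 matrix. */
static Vec4 mat4_mul_vec4(Mat4 M, Vec4 v) {
    return (Vec4){
        M.m[0] * v.x + M.m[4] * v.y + M.m[8]  * v.z + M.m[12] * v.w,
        M.m[1] * v.x + M.m[5] * v.y + M.m[9]  * v.z + M.m[13] * v.w,
        M.m[2] * v.x + M.m[6] * v.y + M.m[10] * v.z + M.m[14] * v.w,
        M.m[3] * v.x + M.m[7] * v.y + M.m[11] * v.z + M.m[15] * v.w,
    };
}
```

In practice you'd hand the matrix straight to OpenGL rather than multiply points on the CPU, but this is what the transform does.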
Update: Unfortunately the promotion has ended and the app is no longer available for download from the App Store. See an archive of the app details.