Review: iSimulate

So you may recall a little while back we mentioned the various options out there for integrating input from the device into your simulator-running application, and today we’re going to delve into just how well the paid option, Vimov’s iSimulate, actually works — since they were kind enough to provide us with a review copy. Let’s see how that works out for them, shall we?

First off, we’re going to try it out with an accelerometer-controlled cocos2d game that we’re going to be working on as soon as things slow down a bit around here (so look for that around the year 2015! Ho ho!) but which already has the accelerometer input working, so it makes a solid test.
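For reference, the plumbing under test is nothing more than the stock cocos2d accelerometer hookup, which circa the 0.8-era API looks roughly like this (GameLayer and _ballVelocity are illustrative names, not our actual code):

#import "cocos2d.h"

@interface GameLayer : Layer {
    CGPoint _ballVelocity;
}
@end

@implementation GameLayer

- (id)init
{
    if ((self = [super init])) {
        // Ask cocos2d to forward UIAccelerometer events to this layer.
        self.isAccelerometerEnabled = YES;
        [[UIAccelerometer sharedAccelerometer] setUpdateInterval:1.0f / 60.0f];
    }
    return self;
}

// UIAccelerometerDelegate callback; with iSimulate attached, the readings
// arriving here are the real device's, even though we're in the simulator.
- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration
{
    // Tilt steers the ball; deliberately twitchy, as we note below.
    _ballVelocity.x += acceleration.x * 9.8f;
    _ballVelocity.y += acceleration.y * 9.8f;
}

@end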

Having installed iSimulate.app on our device, we start it running, then download the SDK and read the instructions:

1. Add the iSimulate library file named “libisimulate.a” to your Xcode project…

Drag, click, done.

2. Add the CoreLocation framework to your project…

Click, scroll, drag, click, done.

3. Add to “Other Linker Flags” the value “-ObjC”…

Hmmm, that’s an interesting request. And what is that option, exactly? Turns out -ObjC tells the linker to load every object file in a static library that contains Objective-C code, so classes and categories that nothing references directly still make it into the binary (there’s a toy illustration of this after the steps). That explains how they do this without any source changes, then. Any-hoo, we do all our configuration in .xcconfig files, so we’ll just edit the base one for this project to

OTHER_LDFLAGS = $(inherited) -ObjC $(TW_CONFIGURATION_OTHER_LDFLAGS)

4. Oh, wait … there is no 4.
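About that toy illustration we promised: we don’t know exactly what libisimulate.a hooks, but the flag strongly suggests a category or +load trick somewhere inside it. So here’s purely our own sketch of why the flag matters (LibraryHook is our name, nothing of Vimov’s). Without -ObjC the linker sees no symbol referencing this object file and strips it, so +load never fires; with the flag it gets force-loaded and can set itself up at launch with zero source changes in the host app:

#import <UIKit/UIKit.h>

@interface UIApplication (LibraryHook)
@end

@implementation UIApplication (LibraryHook)

// Runs automatically when the runtime loads the category, but only if
// the linker actually pulled this object file from the static library,
// which is precisely what -ObjC guarantees.
+ (void)load
{
    NSLog(@"Library hook loaded");
}

@end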

Alrighty then. So we run the app, and lookee there, our computer immediately shows up on the device (you remember from above that we started it running before downloading the SDK, yes?):

[Screenshot: iSimulate_connect.png, our computer listed on the device]

Tap that, and in scrolls the active view:

[Screenshot: iSimulate_active.png, the active view]

Very pretty, yes. Now, your immediate reaction is that this makes it impossible to use the thing to actually manipulate a touch interface, since you can’t see what you’re touching. But they’ve addressed that in a surprisingly effective fashion: translucent dots appear on the simulator at the points where your fingers are touching, as in this screenshot of three fingers touching, one directly over the ‘Menu’ button:

[Screenshot: iSimulate_multitouch.png, three touch dots on the simulator]

Turns out surprisingly workable, too, if not the most precise.
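No idea how Vimov implements their dots, mind you, but should you want the same trick in your own builds, the usual approach is a UIWindow subclass snooping sendEvent:. A minimal sketch of that general technique, all names ours and not iSimulate’s:

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

@interface TouchDotWindow : UIWindow
@end

@implementation TouchDotWindow

// Every touch in the app funnels through the key window's sendEvent:,
// so we can mark touches here without disturbing their delivery.
- (void)sendEvent:(UIEvent *)event
{
    for (UITouch *touch in [event allTouches]) {
        if (touch.phase == UITouchPhaseBegan || touch.phase == UITouchPhaseMoved)
            [self showDotAtPoint:[touch locationInView:self]];
    }
    [super sendEvent:event];  // pass the event on untouched
}

- (void)showDotAtPoint:(CGPoint)point
{
    // Translucent gray dot; removing it shortly afterward leaves a
    // brief trail as the finger moves.
    UIView *dot = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 40, 40)];
    dot.center = point;
    dot.backgroundColor = [UIColor colorWithWhite:0.5f alpha:0.5f];
    dot.layer.cornerRadius = 20.0f;
    dot.userInteractionEnabled = NO;  // don't eat the touches we're marking
    [self addSubview:dot];
    [dot performSelector:@selector(removeFromSuperview)
              withObject:nil
              afterDelay:0.15];
    [dot release];  // performSelector:afterDelay: retains it until it fires
}

@end

Swap that in for your application’s window and every touch gets marked, which comes in handy for the screencast point we’ll get to at the end.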

And speaking of precision … just how precisely does the response in the simulator reflect the actual device input? Hmmmm … well, it’s not absolutely perfect; we definitely noticed a tendency to overshoot the ball’s acceleration on the play screen. But it is somewhere between “very good indeed” and “excellent”. As well, our test subject here has extremely twitchy response (by design), so we’re inclined to believe that the input lag here is as good as you can reasonably expect anything going over Wi-Fi to be.

We proceeded to try various combinations of stopping and restarting the device app and the simulator app to see if we could manage to confuse it, and nope; it managed to detect/connect/reconnect with casual aplomb.
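We didn’t go packet-sniffing to find out how that discovery actually works, mind you; Bonjour over the local network is the obvious suspect, so purely as our own guess at the mechanism (service type and port made up for illustration, nothing here is Vimov’s), the two halves would look something like:

#import <Foundation/Foundation.h>

// Simulator side: advertise so devices on the Wi-Fi network can find us.
static NSNetService *AdvertiseSimulatorSide(void)
{
    NSNetService *service =
        [[NSNetService alloc] initWithDomain:@"local."
                                        type:@"_example-sim._tcp."
                                        name:@""   // defaults to the computer name
                                        port:7777];
    [service publish];  // needs a running run loop to take effect
    return service;
}

// Device side: browse for advertised simulators; the delegate is told
// via -netServiceBrowser:didFindService:moreComing: about each one found.
static NSNetServiceBrowser *BrowseForSimulators(id delegate)
{
    NSNetServiceBrowser *browser = [[NSNetServiceBrowser alloc] init];
    [browser setDelegate:delegate];
    [browser searchForServicesOfType:@"_example-sim._tcp." inDomain:@"local."];
    return browser;
}

Whatever they’re actually doing, it’s robust.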

So it works great for an OpenGL game.

For part 2, we were going to try it out with the multitouch resize ‘Fit Pose’ overlay feature of Poses Volume 1 … but somehow it managed to escape us that iSimulate would, in fact, not make the simulator magically have a camera. And that didn’t occur to us until we’d taken the 30 seconds to go through Steps 1-3 to add it to the project and run it, of course. Oops. So while we were there anyway, we tried it out with the swipe navigation gestures in the full screen gallery, and it worked just as expected on those. And since those work, and we can see the pinch/zoom dots rolling around, we’re quite sure that if we bothered to enable the transmogrification in those instances, the multitouch would work as well as the single touch does.
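There’s nothing exotic about those gallery swipes for iSimulate to cope with, by the way; they’re the classic hand-rolled touch bookkeeping of this pre-gesture-recognizer SDK, roughly along these lines (an illustrative sketch, not the actual Poses source):

#import <UIKit/UIKit.h>

#define kSwipeMinDistance 50.0f   // points of horizontal travel required
#define kSwipeMaxDrift    20.0f   // vertical wobble allowed

@interface GalleryView : UIView {
    CGPoint _touchStart;
}
- (void)showPreviousImage;
- (void)showNextImage;
@end

@implementation GalleryView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    _touchStart = [[touches anyObject] locationInView:self];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint end = [[touches anyObject] locationInView:self];
    CGFloat dx = end.x - _touchStart.x;
    CGFloat dy = end.y - _touchStart.y;

    // Mostly-horizontal travel past the threshold counts as a swipe.
    if (fabsf(dx) >= kSwipeMinDistance && fabsf(dy) <= kSwipeMaxDrift) {
        if (dx > 0)
            [self showPreviousImage];
        else
            [self showNextImage];
    }
}

- (void)showPreviousImage { /* slide in the previous image */ }
- (void)showNextImage     { /* slide in the next image */ }

@end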

However, there was one failure we noticed: whilst tapping on an individual table cell works as expected, it did not seem to recognize a swipe to scroll the table. Which, had we bothered reading to the bottom of the documentation page, we would have seen documented:

Due to technical restrictions, iSimulate does not send touch events for the following UIKit objects (as well as any object based on them): Keyboard, UIScrollView (including MKMapView), UIPickerView and UITableView. All of the other UIKit objects receive all touch events. There are no limitations on OpenGL-based applications.

Hmmm. Wonder what’s up with that? Well, you can always just use the computer’s devices to provide those inputs, so that’s just a mild inconvenience. Although it would be interesting to know exactly what these “technical restrictions” would be, curious trolls that we are.

So! What’s the verdict? Pretty much an unqualified recommendation, that’s what the verdict is. As you can see above, integration with our test products took longer to liveblog about than it took to actually perform; there was no mucking with the source whatsoever, just adding a library, a framework, and a linker flag; the application found risible our best efforts to confuse it; and although it’s not absolutely faithful to how the device reacts natively, it’s very close indeed and probably as good as you can reasonably expect given that there’s a Wi-Fi connection in between. For the $16 it’s priced at as we write this, by the time it saves you half a dozen install cycles it’ll have paid for itself quite handily.

And we haven’t even touched here on the other big benefit of having this around: that when it comes time to make screencasts of your finished product, they’re going to be much more helpful when made off the simulator with this assistance than when made by pointing a video camera at your hand blocking the screen, like most of the blurry demo videos you see around. See the samples here for how well that works out; those little gray dots really do make the video that much more informative, indeed.

Alex | September 11, 2009
