Jake MacMullin

Fetching and parsing JSON with Swift by Jake MacMullin

I'm really excited about Apple's new programming language and keen to learn as much about it as I can. I've found that the best way for me to learn a new language is to use it to build something. So I've started working on a new app (albeit a small one) using Swift.

One of the things this app needs to do is to interact with a web service. If I were writing this app in Objective-C I'd probably default to using AFNetworking for client/server communications and perhaps something like Mantle to simplify the process of converting the JSON returned from the server into model objects I can use in the app. However, as I'm using this project as an opportunity to learn about Swift, I'm not sure it makes sense to rely too heavily on third-party libraries. The main thing I'm hoping to learn is how to solve problems the 'Swift way' (if there is such a thing). I'm worried that if I stick to using the libraries I'm familiar with, all I'll end up learning is a new syntax and I'll miss out on the opportunity to think about how Swift's distinctive language features might let me solve problems in new ways. So I started out using NSURLSession and NSJSONSerialization directly to request JSON from the web service and parse it.
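As a rough illustration of that starting point (a sketch rather than code from the actual app, written in current Swift syntax where NSURLSession and NSJSONSerialization now go by URLSession and JSONSerialization; the endpoint URL is a placeholder):

import Foundation

// Placeholder endpoint; the real app talks to its own web service.
let url = URL(string: "https://example.com/api/stories.json")!

let task = URLSession.shared.dataTask(with: url) { data, response, error in
    guard let data = data, error == nil else {
        print("Request failed: \(error?.localizedDescription ?? "no response")")
        return
    }
    do {
        // Parse the response body into Foundation objects, then pick out what we need.
        if let json = try JSONSerialization.jsonObject(with: data) as? [String: Any] {
            print("Parsed \(json.count) top-level keys")
        }
    } catch {
        print("Parsing failed: \(error)")
    }
}
task.resume()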

Read More

Which iBeacons have the most consistent signals? by Jake MacMullin

We've been working on a number of projects that use iBeacons to provide context about nearby places (and things) to apps. As we move beyond experimentation towards more 'real-world' applications we're noticing that there are a lot of things that make a difference to how well iBeacons work for a given app.

One of the things we've noticed is that the radio signals we detect vary greatly. There are probably a number of reasons for this: different beacons have different characteristics, the radio signals behave differently in different environments, and even behave differently in the same environment over time (depending on how many people are in a room, for example). To better understand the source of the variation we've noticed, we thought we'd test all of our beacons in as controlled an environment as possible to determine how much of the variation is down to the beacon itself and how much is down to other factors.

We wanted to test two things:
- how strong the signal from a given beacon is at a set distance, and
- how much the signal strength varies for a given beacon at a set distance over time (one way we might summarise both is sketched below).
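As a rough illustration of how we might summarise those two measurements for one beacon (the sample values below are made up, not real readings):

import Foundation

/// Collect RSSI samples at a fixed distance from one beacon, then report the mean
/// (how strong the signal is) and the standard deviation (how much it varies over time).
func summarise(rssiSamples: [Double]) -> (mean: Double, standardDeviation: Double) {
    precondition(!rssiSamples.isEmpty, "need at least one sample")
    let mean = rssiSamples.reduce(0, +) / Double(rssiSamples.count)
    let variance = rssiSamples.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Double(rssiSamples.count)
    return (mean, sqrt(variance))
}

// Hypothetical RSSI readings (in dBm) taken over a minute at one metre from a beacon.
let samples: [Double] = [-68, -71, -65, -70, -74, -69, -66]
let summary = summarise(rssiSamples: samples)
print("mean: \(summary.mean) dBm, standard deviation: \(summary.standardDeviation) dBm")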

Read More

Discovering Art with iBeacons by Jake MacMullin

Five weeks ago the National Gallery of Australia (NGA) approached us to find out what we've been doing with iBeacons. I've been excited about the possibility of using Bluetooth LE devices to provide context to mobile apps since I started playing with the technology in mid 2012. So when Apple announced iBeacons at their annual developer conference last year I was enthusiastic about the sort of applications and experiences they could enable. Since then I've created a few proof-of-concept apps to demonstrate how the technology could be used.

After talking about the technology with the NGA we were all so excited about the possibilities that we decided to jump straight in and create an app for an upcoming event at the National Gallery. Each year the gallery invites families to join in a day of festivities in the gallery's beautiful sculpture garden. Sculpture Garden Sunday brings hundreds of people to the sculpture garden so it was the perfect opportunity to pilot a mobile app that might help them enjoy their visit even more. We discussed a number of ideas for mobile apps that could use beacons to provide people with information about the nearest sculptures as they moved through the garden. However, given the extraordinarily short time-frame for the project we needed to come up with an idea for an app that we could create in time for the event, so we decided to focus on providing information about a limited number of sculptures.

The idea

We needed something:

  • that would be fun for families to use and help children engage with the sculptures,
  • that would allow people to find out more information about the sculptures than what was available on the labels, and
  • that would give us all a chance to try out beacon technology to see how well it works for providing gallery visitors with information about nearby artworks.

We proposed the idea of a scavenger hunt: an app that would lead families around the sculpture garden in search of five sculptures. Once all five sculptures had been 'collected' in the app people could claim a prize! This seemed to meet all of our criteria. It was a perfect activity for Sculpture Garden Sunday (and beyond), it'd encourage children to really engage with the sculptures they were hunting for (and offer an opportunity to tell them a little more about the sculptures once they found them), and it gave us an opportunity to try out the beacon technology.

The team at the NGA loved the idea and helped take it from a simple concept to a really rich experience by selecting 5 sculptures that led people to all parts of the sculpture garden and gave people a sense of the varied works on display. They developed some great hints to help people find the sculptures, along with clever questions to answer at each sculpture that really encouraged people to look at the works and think about them.

The NGA 'Eye See Art' App Icon

We worked closely with Georgina Ibarra from Ibarra creative to ensure that the app not only works well but looks great and is a joy to use.

The technology

If you're not familiar with iBeacons, they're a simple idea with countless applications. Bluetooth Low-Energy (BLE) devices are small, extremely low-powered devices with radio transmitters that can broadcast over small distances. They've been around for a number of years and have mostly been used to allow a smartphone to pair with a given device in order to share information or allow you to control the device from your smartphone. There are many BLE fitness devices for example. The chips needed to create a BLE device are so inexpensive and small that they could become pervasive. In the future everything might have a BLE chip in it.

One interesting feature of this technology is that, in order to let smartphones discover them, BLE devices broadcast advertisement data. They send out a radio signal at regular intervals saying "I'm here, I'm a BLE device, you might want to connect to me". Another interesting feature is that smartphones that receive this data also know the strength of the radio signal that they're detecting. It turns out that this simple combination of advertisement data and signal strength is really useful. The advertisement data can contain information that tells you what something is: "I'm a heart-rate monitor" or "I'm a BLE tag on a sculpture" and the strength of the radio signal can let you estimate how close you are to that thing. The stronger the radio signal, the closer you are. iBeacons are simply BLE devices that include certain unique identifiers in their advertisement data. Once you know the unique identifiers that belong to each beacon, you can develop apps that do something when they detect a radio signal from those beacons.
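To make that concrete, here's a minimal sketch (not code from one of our apps) of how an iOS app watches for beacons whose identifiers it knows, using the CoreLocation beacon-ranging API of the time; the UUID and identifier strings are made-up placeholders:

import CoreLocation

class SculptureBeaconListener: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    // The proximity UUID is the identifier the beacons broadcast; this one is made up.
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "A1B2C3D4-0000-0000-0000-000000000001")!,
        identifier: "sculpture-garden-beacons")

    func startListening() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startRangingBeacons(in: region)
    }

    func locationManager(_ manager: CLLocationManager, didRangeBeacons beacons: [CLBeacon], in region: CLBeaconRegion) {
        for beacon in beacons {
            // rssi is the received signal strength; proximity is Apple's coarse near/far estimate.
            print("Beacon \(beacon.major).\(beacon.minor): rssi \(beacon.rssi), proximity \(beacon.proximity.rawValue)")
        }
    }
}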

Our challenge

In our case, we wanted to use iBeacons to provide people with information about the sculptures near them in the sculpture garden. We decided that we wanted to show people a map of the sculpture garden that was missing the 5 sculptures we wanted people to find. As people approached one of the missing sculptures an indicator would appear on the map showing that they were close. Once they were close enough (right next to the sculpture) they could answer a question about the sculpture to prove they'd found the right one and add it to their map.

This idea posed a few challenges. If we were going to use a map, it'd be nice to give people an indication of where they were on the map. We knew we could use iBeacons to determine how close someone was to a certain point, but we weren't sure that we could secure beacons close enough to each of the 5 sculptures for the app to work as we'd envisaged it.

We started to think about ways we could use iBeacons to solve these problems. If we knew where we'd placed each beacon we might be able to detect the signal strength of multiple beacons and combine that information to triangulate the user and determine where they were relative to all the beacons.

A single beacon lets you estimate how close you are, but tells you nothing about where you are.

Combining beacons lets you estimate where you are (if you know where the beacons are).

We even did a little maths:

[Photo of our maths]
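One simple way to combine such readings (a rough illustration only, almost certainly not the working in the photo, and skipping over how you turn signal strength into an estimated distance) is a weighted average of the known beacon positions, with closer beacons counting for more:

import CoreGraphics

struct BeaconReading {
    let position: CGPoint   // where we placed the beacon
    let distance: CGFloat   // distance estimated from signal strength, in metres
}

/// Weight each beacon's known position by the inverse of its estimated distance and average.
func estimatePosition(from readings: [BeaconReading]) -> CGPoint? {
    guard !readings.isEmpty else { return nil }
    var weightedX: CGFloat = 0, weightedY: CGFloat = 0, totalWeight: CGFloat = 0
    for reading in readings {
        // Closer beacons (smaller estimated distance) get a larger weight.
        let weight = 1 / max(reading.distance, 0.1)
        weightedX += reading.position.x * weight
        weightedY += reading.position.y * weight
        totalWeight += weight
    }
    return CGPoint(x: weightedX / totalWeight, y: weightedY / totalWeight)
}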

However, given the short timeframes for the project, we started looking around to see if anyone else had already solved this problem. It turns out they have. Our timing was perfect. Another local (Australian) company had recently launched a new service that uses beacons to provide location information to smartphone apps in exactly this way. Enso Locate has been developed by Art Processors, the team behind the fantastic system used by the Museum of Old and New Art (MONA) in Hobart, Tasmania. We were lucky enough to be amongst the first people using this platform. It really makes a lot of things easy that might've otherwise been much more difficult. We placed the supplied beacons throughout the Sculpture Garden and used Enso Locate's admin web app to indicate where we'd placed each beacon. We were then able to initialise Enso Locate's iOS library with a JSON file published from the admin tool that included information about all the beacons and the locations we'd placed them. Once we'd done that it was a simple matter of asking Enso Locate for continuous location updates. The Enso Locate library takes care of scanning for the beacons and doing the triangulation based on the signal strength. Our app was given two estimates of the user's location every second.

All we needed to focus on was responding to this information about the user's location.

The app does two things with the information about the user's location. It gives the user an indication of where they currently are (or where the app thinks they are) and determines how close they are to the 5 sculptures. The second of these is the simpler of the two, but both required a little bit of work.

Enso Locate provides an estimate of the user's location twice every second based on the signal strength from the surrounding beacons. But the strength of the radio signals fluctuates, so Enso Locate's estimate tends to jump around a bit too. Our first attempt at showing the user where they are resulted in a dot that jumped around the map a fair bit. It was a little disconcerting and gave you the sense that we didn't really know where you were.

We smooth the location updates by calculating an average.

So we do a few calculations to take an average location rather than use each raw location value. We take the most recent 9 values (for example), sort them based on location and then take the middle 3 values. We then calculate the average x, y location of these 3 values. This introduces a slight delay, as each new value only changes the average slightly, but it results in a much smoother indicator. The result is an improvement on showing the user all the raw location updates, but it is still a little noisy. Depending on where you are in the Sculpture Garden we get better or worse location information. If you're in one of the worse areas then the location indicator still drifts around a bit. Perhaps we could improve this by using a more sophisticated algorithm to smooth the location data, such as a Kalman filter. We could improve our estimates of location further by integrating the triangulated signal strength with data from the accelerometer and gyroscope about the position and momentum of the device.
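Here's a sketch of that smoothing step (an illustration of the approach rather than the app's actual code; the text above doesn't pin down how the values are sorted 'based on location', so sorting by distance from the map's origin is an assumption):

import CoreGraphics

struct LocationSmoother {
    private var samples: [CGPoint] = []
    private let windowSize = 9   // keep the most recent 9 raw locations
    private let keepCount = 3    // average the middle 3 after sorting

    /// Add a raw location estimate and return the smoothed location.
    mutating func add(_ raw: CGPoint) -> CGPoint {
        samples.append(raw)
        if samples.count > windowSize {
            samples.removeFirst(samples.count - windowSize)
        }
        // Sort so the most extreme values end up at either end of the array.
        let sorted = samples.sorted { ($0.x * $0.x + $0.y * $0.y) < ($1.x * $1.x + $1.y * $1.y) }
        // Discard the outliers and keep the middle few values.
        let start = max((sorted.count - keepCount) / 2, 0)
        let middle = Array(sorted[start ..< min(start + keepCount, sorted.count)])
        // Average the x and y coordinates of what's left.
        let sumX = middle.reduce(0) { $0 + $1.x }
        let sumY = middle.reduce(0) { $0 + $1.y }
        return CGPoint(x: sumX / CGFloat(middle.count), y: sumY / CGFloat(middle.count))
    }
}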

Another problem we encountered was due to the non-uniform distribution of the beacons in the Sculpture Garden. In order to save time, keep costs down and minimise our impact on the site, we decided to position beacons in existing weatherproof fixtures that were distributed around the garden (mostly light fixtures). This meant we were somewhat limited in where we could place beacons. The garden also has an undulating topography with some sections higher than others. The combination of these factors means that the assumption that a stronger signal means a closer beacon doesn't always hold true. There are some beacons placed high on a hill whose signal can be detected from almost anywhere in the garden. There are others we had to put near concrete walls or metal boxes that can't be detected unless you're very close. Working with an early version of the Enso Locate platform means we're in the privileged position of being able to provide feedback and suggest changes. One such feature we've requested is the ability to 'tune' each beacon and provide a threshold for signal strength below which the beacon is ignored. I think this will make a big difference for our beacon on the hill - it'll mean we'll be able to tell Enso Locate "only include this beacon in your calculations if you're detecting a strong signal from it".
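The kind of per-beacon tuning we have in mind might look something like this (a sketch only; the beacon names and thresholds are made up, and RSSI values are negative dBm, so stronger signals are closer to zero):

import CoreLocation

// Made-up per-beacon thresholds: the hilltop beacon must be heard loudly before it counts.
let minimumRSSI: [String: Int] = ["beacon-on-the-hill": -60, "default": -90]

func shouldInclude(_ beacon: CLBeacon, named name: String) -> Bool {
    // An RSSI of 0 means the signal strength couldn't be measured this cycle.
    guard beacon.rssi != 0 else { return false }
    let threshold = minimumRSSI[name] ?? minimumRSSI["default"]!
    return beacon.rssi > threshold
}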

Sculpture Garden Sunday

The app was ready for Sculpture Garden Sunday and dozens of people did the scavenger hunt on the day. It was a fantastic way to get feedback about the app as we were able to talk to people about it, (discreetly) watch them use it and ask them what they thought. Whilst we and the team at the NGA are still working our way through all the feedback, the early signs are that we achieved what we set out to do. We created an app that improved people's experience of visiting the Sculpture Garden. People seemed to genuinely enjoy the scavenger hunt (or maybe they just wanted the prizes!) and we were able to encourage people to engage with the sculptures and use the app to find out more about them.

Next Steps

We're continuing to work with the NGA to make a few changes based on this early feedback and the app will be available in the App Store in the near future - make sure you download it before your next visit to the National Gallery.

This app has only made us more excited about how iBeacons can provide apps with more context about the world around you. We can't wait to work with the NGA and others to continue to explore the different ways to use iBeacons to improve visitors' experiences.

Storyboards, View Controller Containment and Delegation: Coordinating Responsibilities Within an App by Jake MacMullin

This year marks the 100th anniversary of Australia's capital city Canberra. During this centenary year CSIRO, Australia's peak scientific research body, wanted to reflect on their history in Canberra with an iPad and iPhone app which they asked Stripy Sock to create for them.

The app (available free from the app store) includes stories that can be viewed on a map or on a timeline. It is also a universal app with both an iPhone and iPad version. While both versions allow you to see the timeline and the map, they do so in different ways. On the iPad both the map and the timeline are presented side by side while on the iPhone you can switch between a timeline or map tab.

 

The iPad app showing a map and timeline

This post describes how I used storyboards, view controller containment and delegation so that these map and timeline components could be used in these different ways in both the iPhone and iPad apps.

As I knew that I'd be displaying stories on a map in both versions of the app, I wanted to make sure that any code I wrote in the first version I created (which was the iPad app) could be reused in the next version. But I also knew I wanted different behaviour in each version. Selecting a story on the map in the iPad app should cause the timeline visible alongside the map to scroll to the selected story so you can see both the time and place that relates to the story. However I knew the timeline wouldn't be visible alongside the map on the iPhone. As such I needed to develop the map as a distinct self-contained thing separate from the timeline.

After thinking it through, I determined the app would consist of:

  • A map view controller
  • A timeline view controller
  • A story detail view controller
  • And two container view controllers for the iPhone and iPad versions

The map view controller would be responsible for displaying the list of stories it was given on a map.

The timeline view controller would be responsible for displaying the list of stories it was given on a timeline.

In order to synchronise the map and timeline views in the iPad version I also needed some way to highlight a story on the map or in the timeline when the same story was selected in the other view. However, as explained previously I didn't want to tightly couple the timeline to the map as I knew the two wouldn't appear alongside each other in the iPhone version of the app.

Highlighting a given story was the obvious bit. Both my map and timeline view controllers would need methods to highlight a given story:

// IndexMapViewController.h


@interface IndexMapViewController : UIViewController

/**
* Scrolls the map to the given story and displays its annotation.
*/
- (void)selectStory:(Story *)story;

@end
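
// StoryTimelineViewController.h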


@interface StoryTimelineViewController : UICollectionViewController

/**
* Scrolls the timeline to the given story and changes the
* style of the story's cell to indicate it is selected.
* Pass nil to clear the selection.
*/
- (void)selectStory:(Story *)story;

@end

Slightly less obvious was how to ensure the timeline's selectStory: method is called when a story is selected in the map view and vice versa. This is where delegation comes in. I knew that I wanted the map view to inform some other class when a story is selected (though at that stage of the design of the app I wasn't entirely clear which class that'd be) so using a delegate is ideal. It lets me clearly define the different responsibilities. It's a way of saying: if you're interested in knowing when a story is selected, register your interest with me and I'll tell you. In this way, the map view controller's responsibilities are expanded beyond simply showing the list of stories to also include informing an interested delegate when a story is selected. It's not the map view controller's responsibility to do anything about the fact a story was selected; it simply has to inform its delegate.

// IndexMapViewController.h


@protocol IndexMapViewControllerDelegate <NSObject>

- (void)indexMapViewController:(IndexMapViewController *)controller didSelectStory:(Story *)story;

@end

Likewise, the timeline view controller has a similar delegate protocol:

// StoryTimelineViewController.h


@protocol StoryTimelineViewControllerDelegate <NSObject>

- (void)storyTimelineDidSelectStory:(Story *)story;

@end

The final piece of the puzzle is finding a class to take responsibility for acting as the map and timeline view controllers' delegate and coordinating the two.

// IndexViewController.h

@interface IndexViewController : UIViewController <IndexMapViewControllerDelegate, StoryTimelineViewControllerDelegate>
@end

Now the division of responsibilities is clear. The map and timeline view controllers take responsibility for displaying the list of stories they've been given and informing their delegate when a story has been selected. The index view controller is responsible for providing both with the list of stories to display and acts as the delegate for both, ensuring that when a story is selected in one it is also selected in the other. But how does all this look in the Storyboard? How does the index view controller obtain references to the map and timeline view controllers in order to pass them their list of stories and register as their delegate? It's all done through the magic of view controller containment.

A section of the storyboard showing an index view controller with container view controllers for the map and timeline view controllers.

In the storyboard, the index view controller's view has two container views embedded within it: one for the map and one for the timeline.

However, whilst this provides an easy way to create the parent-child relationship between the index view controller and its two children, it doesn't actually create a reference between the classes. To actually obtain these references and pass messages and data between the view controllers you have to write a bit of code:

// IndexViewController.h

@interface IndexViewController : UIViewController <IndexMapViewControllerDelegate, StoryTimelineViewControllerDelegate>

@property (nonatomic, strong) IndexMapViewController *mapViewController;
@property (nonatomic, strong) StoryTimelineViewController *timelineViewController;

@end

// IndexViewController.m

- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    // if this is an embed segue then keep a reference to the view controller
    // that is being embedded
    if ([segue.identifier isEqualToString:@"EmbedMap"]) {
        IndexMapViewController *mapViewController = (IndexMapViewController *)segue.destinationViewController;
        [self setMapViewController:mapViewController];
        [self.mapViewController setDelegate:self];
    }
    if ([segue.identifier isEqualToString:@"EmbedTimeline"]) {
        StoryTimelineViewController *timelineViewController = (StoryTimelineViewController *)segue.destinationViewController;
        [self setTimelineViewController:timelineViewController];
        [self.timelineViewController setDelegate:self];
    }
}

The prepareForSegue:sender: method is called when the view controller is about to perform a segue. When using segues this is your view controller's opportunity to do something before the app segues to a new view controller. Typically, this method is used to pass variables to view controllers you're transitioning to. However, it can also be used to obtain a reference to any child view controllers that you've added using container views in your storyboard. The code here simply looks for the segue identifiers of the 'embed segues' used to embed the map and timeline view controllers. It then casts each segue's destinationViewController to the type I know it must be before storing a reference to it and setting the IndexViewController as the delegate.

The final step is to implement the map and timeline view controller's delegate methods:

// IndexViewController.m

- (void)indexMapViewController:(IndexMapViewController *)controller didSelectStory:(Story *)story
{
    [self.timelineViewController selectStory:story];
}

- (void)storyTimelineDidSelectStory:(Story *)story
{
    [self.mapViewController selectStory:story];
}

This might seem a little more complex than simply allowing the map view controller to know about the timeline view controller and vice versa, but this approach offers the flexibility to use these components independently of one another. Creating the iPhone version of the app was straightforward as a result.

The iPhone app uses the same map view controller, but this time it is contained within its own tab.

Linking to the App Store nicely by Jake MacMullin

You may have occasion to want to send people to Apple's App Store from within your app. Perhaps you're recommending another app or the user has indicated she'd like to rate your app. If you find yourself in this situation, here's how you can do it nicely.

In iOS 6 Apple introduced an API for displaying an App Store page from within your app. This is much nicer than using a URL that causes your app to close and the App Store app to open, and it is much, much nicer than using a URL that causes Safari to open momentarily before causing the App Store app to open.

Here's what you need to do. 

1. Add the StoreKit framework to your project. 

2. Initialise an SKStoreProductViewController, configure the delegate and ask it to load the details of the product you want to show to your user. 

 

SKStoreProductViewController *storeViewController = [[SKStoreProductViewController alloc] init];
[storeViewController setDelegate:self];

NSDictionary *productParams = @{ SKStoreProductParameterITunesItemIdentifier : @"401778175" };
[storeViewController loadProductWithParameters:productParams completionBlock:^(BOOL result, NSError *error) {
    if (result) {
        [self presentViewController:storeViewController animated:YES completion:nil];
    } else {
        // handle the error
    }
}];

3. Finally, make sure you implement the SKStoreProductViewController's required delegate method.

 

- (void)productViewControllerDidFinish:(SKStoreProductViewController *)viewController {
    [viewController dismissViewControllerAnimated:YES completion:nil];
}

That's it. Now when your user has finished looking at the app you're showing her she can return to whatever she was doing in your app with a single tap.