Apps That Know Where You Are: Our Experimentation With Apple's iBeacon Technology
Introduction to the Lab Program
Earlier this year, The Nerdery unveiled its Nerdery Labs program. It’s an opportunity for employees to submit ideas for projects demonstrating cutting-edge technologies. The Nerds whose ideas show the most potential are given a week to pursue them and produce something to show to other Nerds and the world at large.
I have a strong personal interest in extending user experiences beyond the bounds of traditional mobile apps by interfacing with external technologies. I saw the Nerdery Labs program as the perfect opportunity to pursue that interest…so I submitted a proposal to show the possibilities of Apple’s new iBeacon technology. I was tremendously excited when I heard my idea had been selected, and as soon as I wrapped up the client project I was engaged with, I got to work!
Introduction to iBeacon
Buried in the ballyhoo surrounding the radical visual changes to iOS 7 was an all-new technology introduced by Apple: iBeacon. What it lacks in razzle-dazzle, it more than makes up for in enabling entirely new interactions and types of applications!
It is important to understand that iBeacon is not a device or a new piece of hardware like the TouchID thumbprint scanner. Instead, it is a public protocol or “profile” built on top of the Bluetooth LE (Low Energy) technology which has been present for several years in iOS devices: iPhone 4S and later, iPad 3rd Gen and later, and the 5th Gen iPod Touch. Bluetooth LE was released in 2010 as a lower-power, lower-speed alternative to traditional Bluetooth; devices broadcasting infrequently using Bluetooth LE can run for up to two years on a single coin-cell battery. Any device that announces itself using the iBeacon profile is an iBeacon, whether it is a small, dedicated radio device or an iDevice configured to broadcast as an iBeacon. Apple will not be producing any dedicated iBeacon hardware – that will be left to third parties. Android support for Bluetooth LE was added in 4.3 (Jelly Bean) so there will likely be Android iBeacons in the near future, too.
Figure 1-1. How iBeacon works
At its core, iBeacon is simply a “HERE I AM!” message broadcast roughly once per second to other devices within range of the Bluetooth radio (Figure 1-1). It has a few identifying characteristics so that apps can distinguish the iBeacons they’re interested in from a crowd. iBeacon broadcasts have no data payload; they simply identify themselves via a UUID (universally unique identifier) and two numbers, dubbed “major” and “minor”. You can think of the UUID as the application identifier: each app will use a different one (or more). An app can only listen for specific UUIDs provided by the developer; there is no way to see a list of all iBeacons visible to the device. The major and minor numbers have no intrinsic meaning; they are available for the app to use as the developer sees fit. A common scheme is to designate the major number as the general region and the minor as a specific location within that region. As an example, in an app for Macy’s, the UUID for all iBeacons in all Macy’s stores would be identical. The major number would refer to a particular Macy’s store (22 = San Francisco, 1 = NYC, etc.). The minor number would represent the different departments within the store (14 = Women’s Apparel, 7 = Bedding, 29 = Men’s Shoes, etc.). The numbers represent whatever you decide as you plan out the app. The point is, major and minor could be used to identify more than just physical locations; people, pets, containers, kiosks, luggage, and many other things you want to keep track of on the go could benefit from the technology.
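To make the Macy’s scheme concrete, here is a minimal Python sketch of how an app might translate a raw (UUID, major, minor) broadcast into a human-readable label. The UUID value and the store/department tables are hypothetical examples, not real Macy’s data:

```python
# Example UUID only; a real app would use its own generated UUID.
APP_UUID = "f7826da6-4fa2-4e98-8024-bc5b71e0893e"

STORES = {1: "NYC", 22: "San Francisco"}                 # major -> store
DEPARTMENTS = {7: "Bedding", 14: "Women's Apparel",
               29: "Men's Shoes"}                        # minor -> department

def describe_beacon(uuid, major, minor):
    """Turn a raw (uuid, major, minor) broadcast into a human-readable label."""
    if uuid != APP_UUID:
        return None  # not one of our beacons; the OS filters these out anyway
    store = STORES.get(major, "Unknown store")
    dept = DEPARTMENTS.get(minor, "Unknown department")
    return f"{store} - {dept}"

print(describe_beacon(APP_UUID, 22, 29))  # San Francisco - Men's Shoes
```

The key point the sketch captures is that all meaning lives in the app’s own lookup tables; the beacon broadcast itself carries nothing but the three identifiers.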
One of the first challenges I encountered as I set out to build an iBeacon testbed app was a lack of online resources: iBeacon was a completely new technology and was covered by Apple’s developer NDA until the public release of iOS 7. This meant no Stack Overflow answers and no blog posts…I was alone with Apple’s sometimes-confusing documentation and a single WWDC video. While the NDA has since expired and more resources have come online, it can still be tricky to find solid answers to problems encountered due to the newness of the technology.
The second obstacle that became apparent was the physical necessity of having two or more iOS 7 devices to test with: one or more to broadcast as an iBeacon and one to receive the signals and act upon them. Unlike most other iOS features, iBeacon functionality cannot be tested in the simulator. While some dedicated iBeacon devices existed prior to iOS 7’s release, they were relatively expensive and cumbersome to configure. Since we needed some iOS 7 devices to test apps for iOS 7 compatibility in the App Store, a portion of our device pool was updated and I was able to snag unused devices for testing.
Once I really got into developing the app, it became clear that one of the biggest hurdles to working with iBeacons is organization. A widely deployed iBeacon solution for a national or international client could potentially involve hundreds or thousands of iBeacons. How do you keep track of what the more than 4 billion possible combinations of major and minor values mean? My solution was to first define how I was going to use the major and minor numbers. I chose major to represent the various locations at The Nerdery plus an extra region for help requests (more on that later). In my scheme the following regions were defined:
- 0 = Unknown
- 1 = Bloomington
- 2 = Chicago
- 3 = Kansas City
- 4 = Other
- 999 = Help Request
I decided that I wanted more specificity about what type of location or resource each iBeacon represented, so I broke the minor number down into groups of 1,000:
- 0 = Unused
- 1000 = Entrances
- 2000 = Conference Rooms
- 3000 = Departments
- 4000 = Groups
- 5000 = People
- 6000 = Facilities & Services
Within each group of 1,000, each individual iBeacon would be assigned an ID from 0 to 999. So, in this scheme, the front door is 1001 while our UX department is 3004. I designed a simple JSON data structure to represent this organizational scheme, allowing me to associate a human-readable label with each region, beacon type, and individual beacon. Thus, when my final app encounters a beacon broadcasting major: 1 / minor: 2009, I can display “Bloomington Office - Conference Room: Mordor,” the conference room one does not simply walk into.
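The groups-of-1,000 scheme boils down to simple integer arithmetic: dividing the minor by 1,000 gives the beacon type, and the remainder gives the individual ID. A small Python sketch of the decoding step (the group names are the ones from the list above):

```python
GROUPS = {0: "Unused", 1: "Entrances", 2: "Conference Rooms",
          3: "Departments", 4: "Groups", 5: "People",
          6: "Facilities & Services"}

def decode_minor(minor):
    """Split a minor value into its group-of-1000 name and the beacon ID within it."""
    return GROUPS.get(minor // 1000, "Unknown"), minor % 1000

print(decode_minor(2009))  # ('Conference Rooms', 9)  -- the Mordor room
print(decode_minor(1001))  # ('Entrances', 1)         -- the front door
```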
I had a good reason to put the data into JSON format: I intended to host it on Firebase, one of the many new “back-end in a box” services springing up online. By hosting the configuration online and downloading it at runtime, I am able to effect changes to the names of existing iBeacons or define new ones without re-compiling the app – an essential capability for enabling the addition of new iBeacons down the road. It also allowed me to create a companion “iBroadcast” app which would use the data to let me quickly configure an iOS device to broadcast one of the defined iBeacon signatures. This saved a tremendous amount of time during development by eliminating the need to manually type in the major and minor values every time I wanted to reconfigure an iBeacon.
Making It Fly
With all that in place, interacting with the iBeacons is actually quite easy. Listening for iBeacons involves setting up a filter for the app’s UUID: all beacons; only those with a particular major value (any minor value); or only those with a particular major and minor value. Once I established that I was interested in all beacons for my UUID, I turned on beacon detection (called “ranging”) and handled Core Location’s updates. The updates, provided roughly once per second, list all iBeacons that the device has detected that match your app’s UUID and filter requirements. My app simply sorted the list to determine which iBeacon the device estimated was closest and displayed the information for that beacon.
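The “sort and pick the closest” step can be modeled in a few lines. In Core Location terms, each ranging update carries an estimated distance per beacon (its “accuracy,” in meters, with a negative value meaning the estimate is unavailable). This Python sketch models that behavior rather than calling the real API; the class and field names are my own:

```python
from dataclasses import dataclass

@dataclass
class RangedBeacon:
    major: int
    minor: int
    accuracy: float  # estimated distance in meters; negative = no estimate

def closest_beacon(beacons):
    """Pick the beacon with the smallest valid distance estimate from one update."""
    valid = [b for b in beacons if b.accuracy >= 0]
    return min(valid, key=lambda b: b.accuracy) if valid else None

# One simulated ranging update: Mordor at ~4m, the front door at ~0.8m,
# and the UX department detected but with no usable estimate.
update = [RangedBeacon(1, 2009, 4.2),
          RangedBeacon(1, 1001, 0.8),
          RangedBeacon(1, 3004, -1.0)]
print(closest_beacon(update))  # the front-door beacon (major 1, minor 1001)
```

Filtering out negative accuracies matters: a detected-but-unrangeable beacon would otherwise always sort as “closest.”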
Once that was up and running, I decided it would be fun to add a “Message of the Day” functionality. This was simply another block of JSON stored in Firebase – a beacon identifying string (“1-2-9” for my Mordor conference room example above) and the HTML message to display. This could be used for a variety of purposes like showing the daily schedule for a conference room, reminding employees that today was free salad bar day in the kitchen, or asking visitors for a Tech Talk to sign in at the front desk. If no message existed for a particular iBeacon, the app simply hid the MotD field.
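The MotD lookup is just a dictionary keyed by the beacon identifying string. A sketch with hypothetical messages, reusing the groups-of-1,000 scheme to build the “major-group-id” key:

```python
# Hypothetical MotD data, keyed by "major-group-id" strings as described above.
MOTD = {
    "1-2-9": "<p>Conference Room Mordor: booked solid until 3pm.</p>",
    "1-6-1": "<p>Free salad bar in the kitchen today!</p>",
}

def message_for(major, minor):
    """Build the identifying key from major/minor and return its message, if any."""
    key = f"{major}-{minor // 1000}-{minor % 1000}"
    return MOTD.get(key)  # None -> the app hides the MotD field

print(message_for(1, 2009))  # the Mordor message
print(message_for(2, 1001))  # None: no message, so the field is hidden
```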
How Distance Works with iBeacon
iOS does a lot of work behind the scenes approximating the distance to each beacon. Because broadcast interference can result in wide fluctuations in signal strength, iOS smooths the data to produce a more stable estimate of distance. The estimates are further broken down into four predefined distance zones: unknown, far, near, and immediate (Figure 1-2). When a beacon cannot be detected, either because it is too far away or because it has been switched off, it falls into the unknown zone. From the outer detectable range of around 30m down to roughly 2m, the iBeacon is in the “far” zone. From 2m down to roughly 0.5m, the iBeacon is classified in the “near” zone. Closer than 0.5m, the iBeacon falls into the “immediate” zone.
Figure 1-2. Distance categories for iBeacon
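As a rough model of the zone boundaries described above (iOS computes these internally from smoothed signal strength, so the exact thresholds are approximations, not a documented contract):

```python
def proximity_zone(distance_m):
    """Map an estimated distance in meters to the approximate iBeacon zones.

    None or a negative value models a beacon that cannot be ranged;
    ~30m is the approximate outer limit of detection."""
    if distance_m is None or distance_m < 0 or distance_m > 30:
        return "unknown"
    if distance_m > 2:
        return "far"
    if distance_m > 0.5:
        return "near"
    return "immediate"

print(proximity_zone(10))   # far
print(proximity_zone(1))    # near
print(proximity_zone(0.2))  # immediate
```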
A Little Help Here…
The other proof-of-concept idea we came up with addressed the scenario of a visitor at an event or a shopper in a store needing assistance: it can be frustrating to track down an employee and lead them back to where you had the question. Why not have the employee come find you? Because most iOS devices can be configured to broadcast as iBeacons, we can have a “helper” device look for a “visitor” device that is broadcasting a virtual help request.
It starts with a person in visitor mode tapping a button reading “I need help!” The visitor has the option to add their name to the request or remain anonymous. The visitor’s app uses the Help Request region (999) as the iBeacon major number and picks a random number for the minor number. It then stores the name (if provided), the chosen minor number, and the last iBeacon (if any) the app detected as closest in a new help request data object on Firebase.
This help queue is constantly being monitored by someone running the “helper” mode of the app. When a new help request is added, the helper is notified that someone needs help. The helper acknowledges the request, updating the status on Firebase and causing the visitor’s phone to immediately display a message that help is on the way. From there, the helper can use the visitor’s last-known-iBeacon as a starting point and is provided distance feedback as they approach the visitor. Once they are close enough to identify the visitor who sent the request, they can mark the request complete and provide help.
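The help-request lifecycle boils down to a small shared record plus the reserved major value. This Python sketch models the data flow only; the field names are my own, and in the real app the record lives on Firebase rather than in memory:

```python
import random

HELP_REQUEST_MAJOR = 999  # the reserved "Help Request" region from the scheme above

def create_help_request(name=None, last_seen_beacon=None):
    """Build the record the visitor app stores when "I need help!" is tapped."""
    minor = random.randint(0, 65535)  # random 16-bit minor; helpers range for it
    return {
        "name": name,                       # None -> anonymous request
        "major": HELP_REQUEST_MAJOR,
        "minor": minor,
        "lastSeenBeacon": last_seen_beacon, # helper's starting point, e.g. "1-2-9"
        "status": "open",
    }

def acknowledge(request):
    """Helper accepts the request; the visitor's app watches for this change."""
    request["status"] = "acknowledged"
    return request

req = create_help_request(name="Ada", last_seen_beacon="1-2-9")
acknowledge(req)
print(req["status"])  # acknowledged
```

The random minor is what lets the helper’s device range for this one visitor among potentially many, while the last-seen beacon narrows the physical search before ranging takes over.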
Where To Now?
This is a very basic implementation of the idea, but it could be greatly expanded with two-way messaging, metrics measuring response time for the helpers, success rate, etc. It is tremendously exciting to think up new interactions that iBeacon enables that simply haven’t been practical or even possible before. For the first time, we have a simple means of very precisely determining a user’s position in 3D space, even while indoors, without rapidly draining their battery through constant GPS usage.
In fact, iBeacons have the potential to make a tremendous impact on all kinds of transactions where location is important. Some other potential applications:
- Accurate user location for “check-in” functionality in loyalty programs.
- Offering users promotions or discounts on products they’re actually standing close to (not giving them a Plasma TV promo when they’re clearly looking at refrigerators).
- Accurate in-store mapping, allowing users to navigate to the product they’re looking for within the cavernous space of a big-box store.
The next generation of in-store experiences will be heavily influenced by this technology. It has been rumored that Apple's own retail locations will be blanketed with iBeacons for store displays, making reservations, getting sales assistance, and more. We are looking into the future of contextual interactions with technology. By giving your second brain (your smartphone) an awareness of where you are and context about your situation (which app you have launched), we as developers can create experience-rich applications. iBeacons are just the beginning: wearable technology is just now coming into vogue, and more of what we do each day is influenced by technology and what it is capable of.
Published on 11/19/2013