Building eyeBeacon


At PennApps over President’s Day weekend, my teammates and I set out to build something to bring hyper-localized, contextual information to Google Glass and Pebble. Call it “not quite AR” – the idea being that when a user explores a new place, their devices should passively inform them of interesting facts, additional media (photos/videos/audio), and the option to buy things, all without having to open any apps, scan any QR codes, or manually invoke that content in any way.

Bluetooth LE (BLE) is a standard introduced in version 4.0 of the Bluetooth spec. It offers lower power consumption while maintaining a range similar to classic Bluetooth. BLE is baked into Android starting with 4.3; unfortunately, Google Glass runs Android 4.0.4, an issue we had to overcome (read on to find out more). The predominant standard for Bluetooth “nodes” that can locate a user with a high degree of accuracy (including indoors) is Apple’s iBeacon. Fortunately, the protocol is open and can be used from Android and iOS alike.

For our hack, we used several Raspberry Pi units with USB Bluetooth adapters, running Raspbian and the BlueZ Bluetooth stack. Each beacon was configured to broadcast an iBeacon “advertising” packet, which consists of a UUID (commonly used to associate a beacon with a particular app, service, or organization), a Major ID (typically used to identify a particular location – a store, a branch of a museum, etc.), and a Minor ID (identifying that particular beacon). Within a few hours, I had all the Raspberry Pi devices up and running, broadcasting unique iBeacon data. For a quick look at the iBeacon configuration files and the scripts used to automatically start the broadcast on boot, take a look at this Gist.
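For reference, the advertising payload those scripts broadcast follows a fixed 30-byte layout. Here’s a sketch of that layout in Java – the UUID, Major/Minor, and TX power values are purely illustrative, and the actual beacon side is just BlueZ shell configuration in the Gist:

```java
import java.nio.ByteBuffer;
import java.util.UUID;

public class IBeaconAdvertisement {

    // Builds the 30-byte BLE advertising payload an iBeacon broadcasts: an AD
    // "flags" structure followed by Apple's manufacturer-specific data carrying
    // the proximity UUID, Major ID, Minor ID, and calibrated TX power.
    public static byte[] build(UUID proximityUuid, int major, int minor, byte txPower) {
        ByteBuffer buf = ByteBuffer.allocate(30);         // big-endian by default
        buf.put(new byte[] {0x02, 0x01, 0x1A});           // flags: LE General Discoverable
        buf.put((byte) 0x1A);                             // next structure is 26 bytes long
        buf.put((byte) 0xFF);                             // type: manufacturer-specific data
        buf.put((byte) 0x4C).put((byte) 0x00);            // company ID 0x004C (Apple), little-endian
        buf.put((byte) 0x02).put((byte) 0x15);            // iBeacon indicator + 21 bytes to follow
        buf.putLong(proximityUuid.getMostSignificantBits());  // 16-byte proximity UUID
        buf.putLong(proximityUuid.getLeastSignificantBits());
        buf.putShort((short) major);                      // Major ID (e.g. which venue)
        buf.putShort((short) minor);                      // Minor ID (which beacon in it)
        buf.put(txPower);                                 // measured power at 1 m, used for ranging
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] adv = build(UUID.randomUUID(), 1, 7, (byte) -59);  // illustrative values
        System.out.printf("payload: %d bytes%n", adv.length);
    }
}
```

The TX power byte is what makes distance estimation possible: a scanner compares the received signal strength against this calibrated value to approximate how far away the beacon is.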

The next step was finding these beacons from the user’s device. Since Google Glass runs Android 4.0.4, which lacks Bluetooth LE support, the solution was to locate beacons from the user’s smartphone and push the corresponding content to Glass and Pebble. Since a smartphone app was a necessity to include Pebble support in the first place (apps on Pebble are fairly simplistic and can’t use Bluetooth directly), we felt this was a good compromise. We built an Android background service on top of Radius Networks’ Android iBeacon Library that scans for beacons, averaging the distance measurements over five ranging cycles to reliably determine which beacon is closest when several are detected. The Android app also features an onboarding flow that lets the user sign in with Google+ in one click to authorize our app to access their Glass via the Mirror API, and sign in with Venmo so that mobile payments can occur seamlessly through Glass. The app allows the service to be started and stopped on demand, and a persistent notification indicates whether the service is running and when a beacon has been detected.
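Condensed, the ranging side looks roughly like the sketch below. The library names come from Radius Networks’ library as it existed at the time; BeaconScanService and the reportBeacon helper are illustrative stand-ins for our actual service (the real code is in the GitHub repo linked at the end):

```java
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;

import android.app.Service;
import android.content.Intent;
import android.os.IBinder;
import android.os.RemoteException;

import com.radiusnetworks.ibeacon.IBeacon;
import com.radiusnetworks.ibeacon.IBeaconConsumer;
import com.radiusnetworks.ibeacon.IBeaconManager;
import com.radiusnetworks.ibeacon.RangeNotifier;
import com.radiusnetworks.ibeacon.Region;

public class BeaconScanService extends Service implements IBeaconConsumer {
    private static final int CYCLES = 5;                  // ranging cycles to average over

    private IBeaconManager manager;
    private final Map<String, Double> distanceSum = new HashMap<String, Double>();
    private final Map<String, Integer> sampleCount = new HashMap<String, Integer>();
    private int cycle = 0;

    @Override
    public void onCreate() {
        super.onCreate();
        manager = IBeaconManager.getInstanceForApplication(this);
        manager.bind(this);                               // connect to the library's scan service
    }

    @Override
    public void onIBeaconServiceConnect() {
        manager.setRangeNotifier(new RangeNotifier() {
            @Override
            public void didRangeBeaconsInRegion(Collection<IBeacon> iBeacons, Region region) {
                for (IBeacon b : iBeacons) {
                    String key = b.getMajor() + "/" + b.getMinor();
                    Double sum = distanceSum.get(key);
                    Integer n = sampleCount.get(key);
                    distanceSum.put(key, (sum == null ? 0 : sum) + b.getAccuracy());
                    sampleCount.put(key, (n == null ? 0 : n) + 1);
                }
                if (++cycle < CYCLES) return;             // keep sampling until 5 cycles are in
                String nearest = null;
                double best = Double.MAX_VALUE;
                for (Map.Entry<String, Double> e : distanceSum.entrySet()) {
                    double mean = e.getValue() / sampleCount.get(e.getKey());
                    if (mean < best) { best = mean; nearest = e.getKey(); }
                }
                if (nearest != null) reportBeacon(nearest);
                cycle = 0;
                distanceSum.clear();
                sampleCount.clear();
            }
        });
        try {
            // Null UUID/Major/Minor: match every iBeacon in range.
            manager.startRangingBeaconsInRegion(new Region("eyebeacon", null, null, null));
        } catch (RemoteException e) {
            // the library's scan service isn't available; nothing to range
        }
    }

    private void reportBeacon(String majorMinor) {
        // Illustrative: POST the winning Major/Minor pair, plus the user's ID
        // and token, to the eyeBeacon App Engine service.
    }

    @Override
    public void onDestroy() {
        manager.unBind(this);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;                                      // started service, not a bound one
    }
}
```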

Once the Android app locates a beacon, it makes a request to our App Engine server containing the beacon’s Major and Minor IDs along with the user’s ID and token. The service uses the Mirror API to push the bundle of cards corresponding to that beacon to the user’s Glass. The Mirror API was far and away the simplest way to accomplish this; with Bluetooth scanning on Glass out of the picture, there was no need to build a full GDK application to support this functionality on Glass. A native Glass app could provide richer experiences, but for this hack, the Mirror API handles all of the media types we wanted to support quite nicely. The web service also sends a subset of this content back to the Android app so that it can be pushed to the user’s Pebble. All of this typically happens within 8-10 seconds of the user arriving at a beacon.
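Our server-side code isn’t shown in this post, but the Mirror API half boils down to one timeline insert per card. Here’s a minimal sketch using the Mirror API’s Java client – GlassPusher, the card list, and the bundle naming are illustrative rather than our literal implementation:

```java
import java.io.IOException;
import java.util.List;

import com.google.api.client.auth.oauth2.Credential;
import com.google.api.client.extensions.appengine.http.UrlFetchTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.mirror.Mirror;
import com.google.api.services.mirror.model.NotificationConfig;
import com.google.api.services.mirror.model.TimelineItem;

public class GlassPusher {

    // Pushes the cards for one beacon to the user's Glass timeline. `credential`
    // is the stored OAuth credential from the one-click Google+ sign-in, and
    // `cardHtml` is whatever content our datastore maps to this Major/Minor pair.
    public static void pushBundle(Credential credential, int major, int minor,
                                  List<String> cardHtml) throws IOException {
        Mirror mirror = new Mirror.Builder(new UrlFetchTransport(),
                new JacksonFactory(), credential)
                .setApplicationName("eyeBeacon")
                .build();
        String bundleId = "beacon-" + major + "-" + minor;
        for (int i = 0; i < cardHtml.size(); i++) {
            TimelineItem item = new TimelineItem()
                    .setHtml(cardHtml.get(i))
                    .setBundleId(bundleId)                // group the cards into one bundle
                    .setIsBundleCover(i == 0)             // first card fronts the bundle
                    .setNotification(new NotificationConfig().setLevel("DEFAULT")); // chime
            mirror.timeline().insert(item).execute();
        }
    }
}
```

Bundling matters on Glass: the user sees a single cover card and taps into the rest, instead of having several new cards scattered across the timeline.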
We identified three “templates” for the kinds of data we might want to push to the user’s Glass: an “intro” bundle, which includes a cover card about the destination along with 3-5 cards highlighting exhibits (each of which can include an image); an “info” card, which includes a smaller photo, text, and video; and a “payment” card, which includes an item image and purchase price.

Venmo payments are handled server side; once the user has signed in and authorized our app through the welcome flow on Android, we retain their auth token on the web side for future payments (Venmo’s tokens are good for 60 days and can be renewed with a simple request). The “payment” card features a “pay” context menu option which initiates the payment. After the payment finishes, the card is replaced with a new one indicating the outcome – payment success or failure. This all happens within seconds.
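The “pay” option itself is just a custom menu item on the timeline card. Another sketch with illustrative names (the icon URL and the beaconKey plumbing are assumptions; `mirror` is the same client as above):

```java
import java.io.IOException;
import java.util.Arrays;

import com.google.api.services.mirror.Mirror;
import com.google.api.services.mirror.model.MenuItem;
import com.google.api.services.mirror.model.MenuValue;
import com.google.api.services.mirror.model.TimelineItem;

public class PaymentCard {

    // Inserts a "payment" card: the item image and price live in the HTML, and a
    // CUSTOM menu item adds the "Pay" option. Tapping it sends a timeline
    // notification to our subscribed callback URL, where the server fires the
    // Venmo payment with the user's stored token and then replaces the card
    // with a success or failure card.
    public static void insert(Mirror mirror, String itemHtml, String beaconKey)
            throws IOException {
        TimelineItem card = new TimelineItem()
                .setHtml(itemHtml)
                .setSourceItemId(beaconKey)               // ties the eventual tap to a beacon
                .setMenuItems(Arrays.asList(new MenuItem()
                        .setAction("CUSTOM")
                        .setId("pay")                     // comes back in the notification payload
                        .setValues(Arrays.asList(new MenuValue()
                                .setDisplayName("Pay")
                                .setIconUrl("https://eyebeacon.example.com/pay.png")))));
        mirror.timeline().insert(card).execute();
    }
}
```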

The demo video highlights these different content styles, including videos, photos, text, and payments. I think the possibilities of hyper-localized data are endless, and Bluetooth LE is a very cost-effective and reliable way to provide this experience in a store, museum, theme park, school campus, or anywhere else a user can benefit from relevant data on Glass and Pebble to augment their experience. We will be developing this idea further to make it more power efficient and to bring new types of localized augmented reality to Glass.

The source for the Android companion app, which performs BLE scanning and connects to the eyeBeacon web service, is available on GitHub.
