Building eyeBeacon

(Cover image: eyeBeacon)

At PennApps over President’s Day weekend, my teammates and I set out to build something to bring hyper-localized, contextual information to Google Glass and Pebble. Call it “not quite AR” – the idea being that when a user explores a new place, their devices should passively inform them of interesting facts, additional media (photo/videos/audio), and the option to buy things, all without having to open any apps, scan any QR codes, or manually invoke said content in any way.

Bluetooth LE (BLE) is a standard introduced in version 4.0 of the Bluetooth spec. It offers lower power consumption while maintaining similar range to classic Bluetooth. Starting with Android 4.3, Bluetooth LE is baked into Android. Unfortunately, Google Glass runs Android 4.0.4 (an issue we had to overcome – read on to find out more). The predominant standard for Bluetooth "nodes" that can be used to locate a user with a high degree of accuracy (including indoors) is Apple's iBeacon. Fortunately, this protocol is open and can be used by Android and iOS alike.

For our hack, we used several Raspberry Pi units with USB Bluetooth adapters, running Raspbian and the BlueZ Bluetooth stack. Each beacon was configured to broadcast an iBeacon "advertising" packet, which consists of a UUID (commonly used to associate a beacon with a particular app/service/organization), a Major ID (typically used to identify a particular location – a store location, a branch of a museum, etc.), and a Minor ID (used to identify that particular beacon). Within a few hours, I had all the Raspberry Pi devices up and running, broadcasting unique iBeacon data. For a quick look at the iBeacon configuration files and the scripts used to automatically start the broadcast on boot, take a look at this Gist.
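The advertisement each beacon broadcasts is just a fixed 30-byte payload. As a sketch of the packet layout described above (the UUID and IDs here are made up for illustration), the payload can be assembled like this:

```python
import struct
import uuid

def ibeacon_adv_payload(proximity_uuid: str, major: int, minor: int,
                        tx_power: int = -59) -> bytes:
    """Assemble the BLE advertising payload for an iBeacon: a flags AD
    structure followed by Apple's manufacturer-specific data."""
    # Flags: LE General Discoverable Mode, BR/EDR not supported
    flags = bytes([0x02, 0x01, 0x1A])
    # Manufacturer data: Apple company ID (little-endian), beacon type 0x02,
    # data length 0x15 (21), then UUID, Major, Minor (big-endian), TX power
    mfg = (struct.pack("<H", 0x004C)
           + bytes([0x02, 0x15])
           + uuid.UUID(proximity_uuid).bytes
           + struct.pack(">HHb", major, minor, tx_power))
    # Wrap in an AD structure: length byte, type 0xFF (manufacturer data)
    return flags + bytes([len(mfg) + 1, 0xFF]) + mfg

payload = ibeacon_adv_payload("e2c56db5-dffb-48d2-b060-d0f5a71096e0",
                              major=1, minor=42)
```

The full payload comes out to 30 bytes, just under the 31-byte BLE advertising limit – on the Pi, these bytes are what the BlueZ tools hand to the adapter.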

The next step was finding these beacons from the user's device. Since Google Glass runs Android 4.0.4, which lacks Bluetooth LE support, the solution was to locate beacons from the user's smartphone and push the corresponding content to Glass and Pebble. Since a smartphone app was a necessity to include Pebble support in the first place (apps on Pebble are fairly simplistic and can't use Bluetooth directly), we felt this was a good compromise. Leveraging Radius Networks' Android iBeacon Library, we developed an Android service that runs in the background, scanning for beacons and averaging the distance measurements over five scan cycles to accurately determine which beacon is closest when several are detected. The Android app also features an onboarding flow that lets the user sign in with Google+ in one click to authorize our app to access their Glass via the Mirror API, and sign in with Venmo so mobile payments can occur seamlessly through Glass. The app allows the service to be started and stopped on demand, and a persistent notification indicates whether the service is running and when a beacon has been detected.
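The closest-beacon logic boils down to a rolling average over the last five scan cycles. A minimal Python sketch of that logic (on the phone this lives in the scan service; the class and method names here are invented):

```python
from collections import defaultdict, deque

class BeaconTracker:
    """Smooth per-beacon distance estimates over the last N scan cycles
    and report which detected beacon is closest."""

    def __init__(self, window: int = 5):
        self.window = window
        self.readings = defaultdict(lambda: deque(maxlen=window))

    def record(self, beacon_id, distance_m):
        """Called once per scan cycle with the library's distance estimate."""
        self.readings[beacon_id].append(distance_m)

    def nearest(self):
        """Return the (Major, Minor) ID with the lowest average distance,
        or None until at least one beacon has a full window of readings."""
        averages = {bid: sum(d) / len(d)
                    for bid, d in self.readings.items()
                    if len(d) == self.window}
        return min(averages, key=averages.get) if averages else None
```

Waiting for a full window before reporting keeps a single noisy RSSI reading from flapping the "nearest" beacon back and forth.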

Once the Android app locates a beacon, it makes a request to our App Engine server containing the beacon's Major and Minor IDs along with the user's ID and token. The service uses the Mirror API to push the bundle of cards corresponding to that beacon to the user's Glass. The Mirror API was far and away the simplest way to accomplish this; since Bluetooth scanning on Glass was out of the picture, there was no need to build a full GDK application to support this functionality on Glass. Richer experiences could be provided through a native app on Glass, but for this hack, all of the media types we wanted to support are handled by the Mirror API quite nicely. The web service also sends a subset of this content back to the Android app so that it can be pushed to the user's Pebble. All of this typically happens within 8-10 seconds of the user arriving at a beacon.
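Server side, the lookup is essentially a table keyed on the beacon's (Major, Minor) pair. A simplified sketch (the IDs, bundle names, and response shape are invented; the real handler inserts the cards into the user's timeline via the Mirror API using the stored credentials):

```python
# Hypothetical mapping from a beacon's (Major, Minor) IDs to the card
# bundle the server pushes to Glass.
CARD_BUNDLES = {
    (1, 1): {"bundle": "museum-intro", "template": "intro"},
    (1, 2): {"bundle": "exhibit-a", "template": "info"},
    (1, 3): {"bundle": "gift-shop", "template": "payment"},
}

def handle_beacon_request(major, minor, user_id, token):
    """Resolve the beacon to a bundle; a real implementation would then
    push the cards to Glass and echo a subset back for the Pebble."""
    entry = CARD_BUNDLES.get((major, minor))
    if entry is None:
        return {"status": 404}
    return {
        "status": 200,
        "user": user_id,
        "pushed": entry["bundle"],
        # Subset of the content sent back to the phone for the Pebble
        "pebble": {"title": entry["bundle"], "template": entry["template"]},
    }
```

Keeping the routing on the server means new beacons and content can be added without shipping an app update.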

We identified three "templates" for the kinds of data we might want to push to the user's Glass: an "intro" bundle, which includes a cover card about the destination plus 3-5 cards highlighting exhibits (each of which can include a photo); an "info" card, which includes a smaller photo, text, and video; and a "payment" card, which includes an image item and a purchase price. Venmo payments are handled server side; once the user has signed in and authorized our app through the welcome flow on Android, we retain their auth token on the web side for future payments (Venmo's tokens are good for 60 days and can be renewed with a simple request). The "payment" card features a "pay" context menu option which initiates the payment. After the payment completes, the card is replaced with a new card indicating the outcome – payment succeeded or payment failed. This process happens within seconds.
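As a sketch of how these templates map onto Mirror API timeline items (bundleId, isBundleCover, and menuItems are real Mirror API timeline fields; the helper names and HTML are invented):

```python
def intro_bundle(bundle_id, cover_html, exhibit_htmls):
    """An "intro" bundle: a cover card plus exhibit cards that share a
    bundleId so Glass groups them into one stack on the timeline."""
    cover = {"html": cover_html, "bundleId": bundle_id, "isBundleCover": True}
    return [cover] + [{"html": h, "bundleId": bundle_id}
                      for h in exhibit_htmls]

def payment_card(item_html, price):
    """A "payment" card exposing a custom "pay" menu item; selecting it
    notifies our server, which charges via Venmo and swaps in an
    outcome card."""
    return {
        "html": item_html,
        "menuItems": [{"action": "CUSTOM", "id": "pay",
                       "values": [{"displayName": "Pay $%.2f" % price}]}],
    }
```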

The demo video highlights these different content styles, including videos, photos, text, and payments. I think the possibilities of hyper-localized data are endless, and leveraging Bluetooth LE is a cost-effective and reliable way to provide this experience in a store, museum, theme park, or school campus – anywhere a user can benefit from relevant data on Glass and Pebble to augment their experience. We will be further developing this idea in the future to make it more power efficient and to bring new types of localized augmented reality to Glass.

The source for the Android companion app, which performs BLE scanning and connects to the eyeBeacon web service, is available on GitHub.

Wow Such Roadtrip

(Image: Such Roadtrip web interface)

Such Roadtrip automatically creates a scrapbook of your trip from photos, tweets, Facebook updates, and much more. Roadtrip organizes all your activities and plots them on a map in the context of your road trip, so you can relive the experience.
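Under the hood, the organizing step is essentially a merge of timestamped events from each source into one chronological timeline for plotting. A sketch of that idea (the source field names here are assumed, not the app's actual schema):

```python
def build_timeline(photos, tweets, statuses):
    """Normalize activities from each source into one event shape and
    sort them chronologically for plotting along the route."""
    events = ([{"type": "photo", "time": p["taken_at"], "loc": p.get("loc")}
               for p in photos]
              + [{"type": "tweet", "time": t["created_at"], "loc": t.get("loc")}
                 for t in tweets]
              + [{"type": "status", "time": s["time"], "loc": s.get("loc")}
                 for s in statuses])
    return sorted(events, key=lambda e: e["time"])
```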

My team and I built Such Roadtrip in a weekend at Hacktech 2014 in Santa Monica. We won a Digital Ocean API prize worth $2,000!

Such Roadtrip consists of an Android app and a web service built with Node.js and MongoDB. The source for the Android app is available on GitHub.

(Screenshots: Such Roadtrip Android app)

From OC to Las Vegas in a Model S for CES 2014


“Range anxiety” is a term that’s been tossed around a lot over the years as EVs struggled to reach the mainstream due to limited range and a lack of charging infrastructure. But Tesla has solved these problems. A road trip from Orange County, CA to Las Vegas for CES 2014 proved that road trips in a Tesla are easy, enjoyable, and best of all, don’t cost a cent thanks to Tesla’s fast and free Supercharger network. Our Tesla has the 60 kWh battery, the smaller of the two packs Tesla offers – but regardless, making the trip was no problem, and we closely matched (and in some cases exceeded) the EPA estimated range the entire way despite the major elevation changes on the route. Here’s the breakdown:

  • Left Orange County with 202 rated miles (almost a full charge, which would be around 208 for the 60 kWh car).
  • Arrived at the Barstow Supercharger with 97 rated miles (actual distance: 102 miles; rated range used: 105).
  • Left Barstow with 200 rated miles, arrived in Las Vegas with 39 (actual distance: 157 miles; rated range used: 159).
  • Left Las Vegas with 198 rated miles, arrived in Barstow with 43 (actual distance: 157 miles; rated range used: 155).
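A quick check of the numbers above shows rated-range consumption staying within a few percent of actual distance on every leg (a ratio above 1.0 means slightly more rated miles consumed than miles driven):

```python
legs = [
    # (leg, actual miles driven, rated miles consumed)
    ("OC to Barstow",        102, 105),
    ("Barstow to Las Vegas", 157, 159),
    ("Las Vegas to Barstow", 157, 155),
]
for name, actual, rated in legs:
    print(f"{name}: {rated / actual:.2f} rated miles per actual mile")
```

which works out to 1.03, 1.01, and 0.99 respectively.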

The weather was perfect – it was cool outside, so AC usage was minimal, with just the fan pulling in fresh air. A trip in the hot summer months would likely have required AC usage that increased energy consumption, but the same would be true in an ICE-powered car.
Since the Las Vegas Supercharger was still under construction during our visit, we made do with charging at the Tesla Service Center in Las Vegas, a few miles south of the Strip, and topping up using a 110 V outlet at our hotel. The availability of the Supercharger in downtown Las Vegas will make this trip even more seamless in the future.

The Model S is an awesome road trip car and made this trip even more memorable.

Photo credit: Rudy Pesci