I recently managed to acquire a pair of Google Glass, Google's wearable technology that was first shown almost two years ago, followed by one hell of a keynote at last year's Google I/O, with blimps, parachutes and other awesomeness.
This article is probably the first one in a series of (not too) technical articles about my tinkering with my pair of Glass. So watch this space! Let’s go!
While I won’t detail the packaging and unboxing at this point (others have done it elsewhere), some points are worth mentioning: the glasses run Android 4.0.4 at a 640×360 resolution, and you can switch them to debug mode so you can sideload your APKs to test them. This is probably their most awesome feature next to the navigation and picture-taking.
One of the current drawbacks is that you can’t launch apps installed through the debugger from the default home screen, unless you install an alternative launcher like Launchy, which uses Android’s intent mechanism to replace it (have a look at the source, by the way; it’s quite interesting!).
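For the curious, this is standard Android: a replacement launcher simply declares an activity that answers the HOME intent, and the system then offers it as a home screen. A minimal sketch of such a manifest entry (the activity name here is illustrative, not Launchy’s actual code):

```xml
<!-- Illustrative AndroidManifest.xml fragment: an activity registered
     as a home screen. This is the intent mechanism a replacement
     launcher such as Launchy relies on. -->
<activity android:name=".LauncherActivity">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.HOME" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```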
One of the first things I did when I got my pair last week was to make a quick “Hello, World!” app to check whether this would actually work on them, and yes, it did!
To make the app Glass-friendly, I had to use a fullscreen theme such as
@android:style/Theme.Holo.NoActionBar.Fullscreen; otherwise, the
ActionBar and status bar would pop up, which makes absolutely no sense on this form factor. Also, depending on the kind of app you are making, you may need to keep the screen alive, using WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON or android:keepScreenOn="true" on your root layout.
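Concretely, the two tweaks look something like this (a sketch, with an illustrative activity name; the theme and the keepScreenOn attribute are the stock Android ones):

```xml
<!-- AndroidManifest.xml: a fullscreen theme keeps the ActionBar and
     status bar from ever appearing (".MainActivity" is illustrative). -->
<activity
    android:name=".MainActivity"
    android:theme="@android:style/Theme.Holo.NoActionBar.Fullscreen" />

<!-- Alternatively, on the root layout: keep the display from dimming
     or sleeping while the app is visible. -->
<FrameLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:keepScreenOn="true" />
```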
Well. Enough of kittens. Let’s make a more relevant app: for example, location-based predictions of bus/tram/metro arrival times from my former STIB app (public transportation in Brussels). While I don’t actually live in Brussels anymore, this was the easiest thing to do, as I just had to reuse old code to make it work. I will soon try to make one for San Francisco’s MUNI.
What is the app like if I compile it “out of the box” and run it on my Glass? Well, it’s not so Glass-friendly:
How would you navigate through such a thing? Well, that is the tricky part: you can’t really do much. The gestures on the touchpad on the right side of the glasses are passed to the Android app as key presses, and these are the ones you can base your app on:
- Left (same as left key on a physical D-pad on phones)
- Right (same as right key on a physical D-pad on phones)
- Tap (same as “enter” key on a physical D-pad on phones)
- Swipe down (back button on phones, will trigger a finish() of your Activity)
The events listed above should be enough to build a basic Glass app. In the example above, focus would move from one focusable item to the next until I decided to tap one.
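A sketch of how those gestures can drive an app, written as plain Java so it stands alone: the integer values mirror Android’s android.view.KeyEvent constants (KEYCODE_DPAD_LEFT and friends), which is what you would actually compare against inside your Activity’s onKeyDown().

```java
// Sketch: Glass touchpad gestures arrive as D-pad key events; map them
// to app-level actions. The values mirror android.view.KeyEvent's
// KEYCODE_DPAD_* constants.
public class GlassKeyMapper {

    public static final int KEYCODE_DPAD_LEFT = 21;   // swipe left on the touchpad
    public static final int KEYCODE_DPAD_RIGHT = 22;  // swipe right on the touchpad
    public static final int KEYCODE_DPAD_CENTER = 23; // tap

    /** Translates a keycode into an app-level action name. */
    public static String actionFor(int keyCode) {
        switch (keyCode) {
            case KEYCODE_DPAD_LEFT:
                return "previous"; // move focus / page backward
            case KEYCODE_DPAD_RIGHT:
                return "next";     // move focus / page forward
            case KEYCODE_DPAD_CENTER:
                return "select";   // "press" the focused item
            default:
                return "unhandled"; // e.g. swipe down, which finishes the Activity
        }
    }
}
```

In a real Activity, onKeyDown(int keyCode, KeyEvent event) would dispatch to something like this and return true for the handled cases.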
So I decided to adapt the code of one of the most relevant screens of the STIB app, the one that shows predictions for the 10 closest stops: I split the long list of 10 stops into 10 fragments inside a ViewPager, each fragment representing one stop and its predictions. To go from one stop to another, a simple swipe on the side of the glasses is enough.
Here is what the prototype looks like, with the closest stop displayed along with its predictions (yes, I know, 8,884 km is a lot, but that’s the distance from San Francisco to Brussels):
A swipe to the left reveals the next bus stop prediction in the list:
Adapting my code to make it Glass-friendly took me about 45 minutes; writing this blog post took about an hour. It is really easy for an Android developer to go Glass.
I can’t wait to see a real SDK for Glass: the soon-to-be-released-but-always-delayed GDK. Access to features like the camera, sensors and so on will truly unleash the power of these devices.
Also, in case you were wondering: the location is obtained through the old location API. I should test with the new ones.
I will soon adapt beCycle so it has Glass support. Imagine how cool it will be to get info about available bike-sharing parking spots while riding your shared bike. No more stopping on the side of the road to check your phone; you’ll get your info on the go!
In the next article, I will try to find a way to enter text using your voice.