Baseball cap head-up display

For a while now I’ve been interested in augmented reality. Working in the field of geospatial science, I’m always looking for better ways to present location-based information, and augmented reality shows a lot of potential. Plus I’m a big fan of science fiction, and AR interfaces have featured in some of my favourite movies – The Terminator (1984) and RoboCop (1987) – as well as some of my favourite books – Neuromancer (1984) and Virtual Light (1993).

But although the idea has been a part of popular culture for nearly three decades, the reality has been disappointing. Head-up displays have long been used by military pilots, and are starting to be deployed in luxury cars, but personal units have been gimmicks at best. There have been augmented reality motorcycle helmets and ski goggles, and Google is rumoured to be releasing Google Glasses by the end of the year, but none of them provide a full field-of-view display. This limits their ability to provide fully immersive graphics or even to play movies.

So I decided to build my own. How hard could it be? Not very, it turns out.

I put one together using a smartphone (an HTC Desire Z), a baseball cap, a couple of Fresnel lenses, a plastic mirror, some mirrored film from an iPhone screen protector, and assorted office supplies. The smartphone is held in place with a rubber band, and whatever appears on the screen is displayed in the wearer’s field of view. The Fresnel lenses are there to push the screen’s perceived distance out to about 50cm, otherwise you’d get serious eye strain.
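The optics are just the thin-lens equation, 1/f = 1/d_o - 1/d_i, where d_o is the screen-to-lens distance, d_i is the distance to the virtual image, and f is the focal length (magnitudes, with the virtual image on the same side of the lens as the screen). Taking some round numbers purely for illustration: with the screen 8cm from the lens, placing the virtual image at 50cm requires 1/f = 1/8 - 1/50, i.e. f ≈ 9.5cm, so the screen sits just inside the focal length. As a bonus, the virtual image is magnified by roughly 50/8 ≈ 6 times, which is what lets a phone screen fill your field of view.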

It works pretty well. Lining up the reflective screens is fiddly, so future designs should lock them in place. The display also struggles in direct sunlight: the mirrored film isn’t very reflective, so the reflected image of the screen is too dim to compete with bright ambient light. Ideally you’d want some sort of mirror that automatically adjusts to the ambient light, and you’d want the smartphone screen to dim at night, but it works fairly well indoors.

The image quality from Fresnel lenses is mediocre at the best of times, but I was forced to use a pair of them in series to get the focal length short enough. As you’d expect, that creates a fair bit of distortion at the periphery, especially since the mounting isn’t very rigid. It would be better to use a single, rigid lens with just the right focal length – preferably a bit shorter than my current setup, since the 50cm screen distance is a bit too close to be comfortable. I suspect that the best solution would be to do away with the Fresnel lenses altogether and use curved mirror-tinted surfaces to display the image, but that’s beyond my design and manufacturing ability.
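For reference, two thin lenses stacked close together combine as 1/f = 1/f1 + 1/f2, so (again with illustrative numbers) a pair of 20cm lenses behaves like a single 10cm lens. That’s how the pair in series shortens the focal length, at the cost of compounding each lens’s distortion.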

My HTC is fairly heavy with its slide-out keyboard, and with all that weight on the end of the bill you need to put the cap on nice and tight to keep it in place. Something a bit lighter, like an iPhone, would probably be more comfortable.

But it works as a proof-of-concept. It’s pretty cool to run Wikitude with some duct tape over the camera lens, and see all the points of interest rotate around as you turn your head. Now all it needs is some custom software so I can see where I’m going while I read my e-mail.

Taking pictures of an inside view of the display was tricky. In the end, I turned the cap upside-down, stuffed a t-shirt in it, and rested a manual-focus camera on top. The resulting pictures were taken through an upside-down display, but you get the idea.


Android X server

For the past few months I’ve been implementing an X11 server to run natively under Android. In the near future I may have need for a serializable user interface, so to get a better understanding of how they work I decided to implement the de facto standard, X11.

Well, it turns out the X protocol is bigger than I thought, but through sheer bloody-mindedness I got it finished. And it might actually be useful.
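To give a sense of the scale: every core request arrives as a binary message with a common four-byte header, and each opcode needs its own parsing and dispatch code. Here’s a stripped-down sketch of the read loop – illustrative only, not the actual project code, and it assumes the client negotiated MSB-first byte order, since DataInputStream is big-endian:

    import java.io.DataInputStream;
    import java.io.IOException;

    class RequestReader {
        // Every core X11 request starts with the same 4-byte header:
        //   byte 0:    major opcode (identifies the request)
        //   byte 1:    one byte of request-specific data
        //   bytes 2-3: total request length, in units of 4 bytes
        void processRequest(DataInputStream in) throws IOException {
            int opcode = in.readUnsignedByte();
            int data = in.readUnsignedByte();       // meaning depends on the opcode
            int remaining = in.readUnsignedShort() * 4 - 4;

            switch (opcode) {
                case 1:   // CreateWindow
                case 14:  // GetGeometry
                case 72:  // PutImage
                    // ... one handler per opcode, and there are roughly
                    // 120 of them in the core protocol ...
                    break;
                default:
                    in.skipBytes(remaining);  // unimplemented: discard the body
            }
        }
    }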

I had assumed that all internet-enabled smartphones would be sitting behind NAT-ing routers, both for security reasons and to conserve IPv4 addresses. But no, on the “3” network in Australia at least, phones all have externally-accessible IP addresses, meaning they can run servers. So you could potentially launch a Linux X application out in the cloud and have it display on your phone.
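For example, any Linux X client can be pointed straight at the phone, e.g. DISPLAY=xxx.xxx.xxx.xxx:0 xclock. Display :0 corresponds to TCP port 6000, which of course has to be reachable from wherever the client is running.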

The user interface is fairly simple: touch the screen to move the pointer, and use the directional pad to activate the left/middle/right buttons. Update: the volume up/down buttons now work as mouse left/right buttons. Both virtual and physical keyboards are supported.
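For anyone curious how the volume keys get repurposed: an Android Activity can consume key events before the system acts on them. A minimal sketch – the class name and the injectPointerButton() helper are made up for illustration, and this isn’t the actual project code:

    import android.app.Activity;
    import android.view.KeyEvent;

    public class XServerActivity extends Activity {
        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
            if (keyCode == KeyEvent.KEYCODE_VOLUME_UP) {
                injectPointerButton(1, true);   // X11 button 1 = left
                return true;                    // consume; don't change volume
            }
            if (keyCode == KeyEvent.KEYCODE_VOLUME_DOWN) {
                injectPointerButton(3, true);   // X11 button 3 = right
                return true;
            }
            return super.onKeyDown(keyCode, event);
        }

        // Hypothetical helper: deliver a ButtonPress to the X event queue.
        // An onKeyUp override would do the same with pressed = false.
        private void injectPointerButton(int button, boolean pressed) {
            // ... forward to the server's input handling ...
        }
    }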

The source code is available at https://github.com/mattkwan/android-xserver/ (it was originally hosted at http://code.google.com/p/android-xserver/) under an MIT licence, and the application (called X Server) is available for free through the Play Store (formerly the Android Market).

There are a few parts of the X protocol it doesn’t implement …

  • Dynamic colourmaps. Android only supports a 24-bit static colourmap.
  • Dashed lines, tiles, and stipples. There’s no native support for these in Android, and seriously, does anyone use them?
  • Drawing operations other than Copy and Xor. That’s all Android supports (see the sketch after this list).
  • Queueing keyboard and pointer events during grabs.
  • Most extensions. XGE, XTEST, BigRequests, and Shape are implemented, but that’s it. There are hooks provided in the code, so if you’re feeling ambitious, try implementing some others. Quite a few applications use them.
  • Key click, auto-repeat, and keyboard LEDs.
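On the Copy/Xor point: an X11 graphics context’s “function” selects one of 16 raster operations, but an Android Paint can only express a couple of them. Roughly speaking – an illustrative sketch, not the actual project code:

    import android.graphics.Paint;
    import android.graphics.PixelXorXfermode;

    // Map an X11 GC function onto an Android Paint. Only GXcopy and
    // GXxor have Canvas equivalents; the other 14 raster operations
    // (GXand, GXor, GXinvert, ...) have no analogue.
    final class GcPaint {
        static final int GX_COPY = 0x3;  // X11 GXcopy
        static final int GX_XOR = 0x6;   // X11 GXxor

        static Paint paintFor(int function, int foregroundColour) {
            Paint paint = new Paint();
            paint.setColor(foregroundColour);
            if (function == GX_XOR) {
                // PixelXorXfermode draws op ^ src ^ dst.
                paint.setXfermode(new PixelXorXfermode(0));
            }
            // GXcopy is just the default source-over draw.
            return paint;
        }
    }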

The server also ships without a window manager, which is a problem because a number of applications expect one to be running. The code includes a parameter specifying an Android service to be launched once the X server is running, and this is intended to start a window manager. But first someone will have to implement a window manager in Android, and doing that properly requires a re-implementation of Xlib. Not me, I’m afraid.

However, there is a workaround. Because access control is disabled by default, you can run a window manager remotely, e.g. fvwm -d xxx.xxx.xxx.xxx:0. Not very efficient in terms of network traffic, but it works.


PhD final draft complete

The last few months were spent at conferences and sitting in front of a computer knocking out the final draft of the thesis. I was working on another project over the Christmas break, but I’ll discuss that in another post. What I’m going to talk about here is the status of our Heritage Health Prize efforts.

Basically, we’ve stopped working on it. We haven’t submitted since August last year, and we’ve slipped to 12th position (although we climbed three places due to teams ahead of us merging). There are a few reasons …

  • We ran out of ideas. Simple as that. Actually, I’ve got a few ideas I haven’t tried, but they’re a lot of effort to implement and the pay-off won’t be worth it.
  • It’s not cost-effective. The US$3 million grand prize won’t be won (no team is going to reach the accuracy threshold it requires), trust me. So the best we can hope for is $500,000. Shared between two people, for two years’ work (call it US$125,000 each per year at best), converted to Australian dollars, I’d be better off with a real job. And that’s assuming we win.
  • I’ve learned all I wanted to learn. I’ve never done data mining before, so one motivation for competing was to get up to speed on the latest techniques. Mission accomplished. I’ve now resurrected my undergrad linear regression skills and learned all about decision trees. When it comes to learning new skills I hit the point of diminishing returns long ago.
  • It isn’t useful. Probably the greatest pleasure I gain from writing software is knowing that someone will use it. I hate wasted effort. That’s why I much prefer the business world over academia. Unfortunately, due to privacy safeguards, the data provided in the competition is nothing like real world data, so the algorithms we develop will never be used in practice. That wasn’t the competition’s intention, but that’s the way it will play out.

Continuing on the last point, consider the following example. Probably the easiest hospitalization outcome to predict is childbirth. On the day a pregnant woman gets her first medical check-up, the doctor can pretty much pencil in the date she’ll need a hospital bed. Sure, some pregnancies end in miscarriage or late-term abortion, but they often require hospitalization as well.

Unfortunately, the HHP data doesn’t contain enough information to figure out the date of conception. Or to tell for certain if the patient was pregnant. Or if they had an abortion after discovering they were pregnant. You can tell when they actually gave birth (it’s a hospitalization event with a specific code), but when I tried to predict those outcomes I was wrong almost as often as I was right.

In other words, I think the world’s best data mining software, trained on crippled data, will be less effective at predicting hospitalization than a medical professional using real data. So the software will never be used, and all the effort will be wasted.