A couple of months ago I got one of the new HTC Droid phones. I’ve had smartphones before, so I’m already familiar with how much power they can pack into these things, but I’ve never had one with a GPS. So, naturally, as soon as it was out of the box I was cruising the Android Market looking for whizbang apps that would take advantage of all the capabilities built into my shiny new phone, and I ran across this thing called Layar.
Layar (http://www.layar.com) is an Augmented Reality app that takes digital data and overlays it on the phone’s live camera view. That sounds easy enough, but there’s a lot going on behind the scenes to make it work: the phone has to know where it is in 3D space, which way is north, and which way it is facing. Using the camera as an ‘eye’ is cool too, but that pales in comparison to referencing the earth’s magnetic field to determine direction. Cool, right?
Augmented Reality is also cool, but it isn’t new; we see it all the time on television and in sci-fi. The first-and-ten line on a football field and the Terminator’s targeting overlay are both AR, but it wasn’t until I got my hands on this phone that I could do it myself. And I don’t mean just viewing the data, either: Layar has an open API and actively encourages people to create their own layers and publish them for the world to see. So, when I was selected to participate in GIS Inc’s R&D Award program and pick my own development project, guess what I chose.
There are a lot of Layar layers out there, but they’re mostly point locations: Starbucks, houses for sale, restaurants, etc. But if it can do points in 3D space, and it can do custom models, pictures, movies, sounds, etc., why not polygons? And for that matter, why not 3D polygons? The documentation and some of the samples I saw were encouraging, so I decided to give it a go.
Step One – the data. There’s a lot of public domain data out there, so I started poking around looking for vector polygons. They didn’t have to be 3D, since I can extrude them up based on an attribute, but I did want real-world GIS data, and a lot of it too, not just a couple of buildings I sketched in by hand. What I settled on was NRCS Soils data; it has a wide coverage area and is complex enough to make a good test. It’s very complex, in fact, enough so that the first thing I did was generalize it using a 5-meter tolerance and dice it so that any polygon with over 500 points was cut into smaller ones. That’s cheating a little, but to be fair I did load three entire counties’ worth of soil polygons (Pulaski Co., VA and Jefferson/Shelby Counties, AL) and published them using ArcGIS Server.
It turns out that while Layar can display polygons, models, etc., it still thinks in terms of points. A feature is shown depending on the user’s distance to that feature’s ‘hotspot’ and is drawn using offsets from that point, not XYZ coordinates. So, I created a point feature class using the Create Random Points tool in ArcGIS to ensure that I have at least one point in every polygon (more on that in a minute) and used Spatial Join to assign the polygon attributes to the points inside. Each point has a unique ID as well as the ID of the parent polygon and its attributes. It would be possible to create the points dynamically without creating a feature class (for example, you could use the vertex closest to the user’s position as the hotspot), but that slows down the map request, since you’d be querying polygons instead of points, and could cause problems when caching the models.
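The key invariant here is that every hotspot point actually falls inside its parent polygon. Create Random Points and Spatial Join handle that in ArcGIS, but if you were rolling your own, the standard even-odd ray-casting test is all you need. A minimal sketch (the function name and inputs are mine, not part of any ArcGIS or Layar API):

```python
def point_in_polygon(x, y, ring):
    """Even-odd ray casting: cast a ray to the right of (x, y) and
    count how many polygon edges it crosses; odd means inside."""
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        # does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # x coordinate where the edge crosses that line
            xcross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xcross:
                inside = not inside
    return inside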
Step Two – the web service. When you set up a Layar layer, you register a URL; when someone loads your layer, their browser sends a request to that address including things like the user’s location and any optional filters you want the user to have control over (range, transparency, etc.). Usually this is a simple database query against a list of points and coordinates, but for this project the web service makes a call to ArcGIS using the REST API. This allows me to filter the results by distance from the user before sending them to Layar. The Layar browser has a distance setting, so it does some filtering of its own, but for performance reasons Layar recommends returning no more than 50 POIs (points of interest). The query results are then formatted to match Layar’s specs and returned as a JSON response.
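To give an idea of the shape of that JSON, here’s a hedged sketch of formatting query results into a Layar response. The top-level field names (layer, hotspots, errorCode, errorString) follow the Layar developer API as I understand it, and the v3 API expected lat/lon as integer microdegrees; the helper function and its input dicts are invented for illustration:

```python
import json

def build_layar_response(layer_name, features, max_pois=50):
    """Format query results as a Layar getPOIs-style JSON response.

    Layar recommends returning no more than 50 POIs, so the list is
    capped. errorCode 0 signals success to the Layar browser.
    """
    hotspots = []
    for f in features[:max_pois]:
        hotspots.append({
            "id": f["id"],
            "title": f["name"],
            # Layar v3 took coordinates as integer microdegrees
            "lat": int(round(f["lat"] * 1e6)),
            "lon": int(round(f["lon"] * 1e6)),
        })
    return json.dumps({
        "layer": layer_name,
        "hotspots": hotspots,
        "errorCode": 0,
        "errorString": "ok",
    })
```

In the real service the `features` list would come back from the ArcGIS REST query rather than a database.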
As I mentioned before, Layar thinks in terms of points. For our project, this means each polygon is represented by at least one point and is actually a model drawn using offsets from that point. Layar checks the distance from the phone to the feature’s hotspot to determine whether it should be displayed, so if you have a long polygon with a hotspot on one end and the user on the other, it won’t show up correctly. This is why the Create Random Points tool was used with tolerances set so that a large polygon would have more than one hotspot.
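That distance check, both the one the Layar browser does and the pre-filtering in the web service, is just a great-circle distance from the user to each hotspot. A minimal sketch, with the function names and the hotspot dict fields as my own assumptions:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def hotspots_in_range(user, hotspots, radius_m):
    """Keep only the hotspots within radius_m of the user's position."""
    return [h for h in hotspots
            if haversine_m(user[0], user[1], h["lat"], h["lon"]) <= radius_m]
```

With one hotspot per polygon, a hotspot on the far end of a long polygon can fall outside this radius even while the user is standing on the polygon itself, which is exactly why the big polygons got several hotspots.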
Step Three – building models. Given a ring of 2D coordinates, it’s not hard to make them into walls: just add the height to the z coordinate and loop through. For this example the height is pulled from the polygon’s slope attribute, since it provides a good range of values. It’s also not hard to represent the walls with triangles, which it turns out you have to do to create a renderable face; simply add a diagonal to each section of wall. It is tricky, however, to cover an irregular, concave polygon with a top made of triangles (Google ‘triangulation’; people are making careers out of it). The model has to be in Layar’s encoded, binary .l3d format, so you have to get the specs from their support folks, who are surprisingly helpful. And finally, the model has to have a colored, semi-transparent PNG assigned as its texture so you can see through it, or it spoils the effect. For this example I used random colors, but this could easily be based on an attribute. The .l3d files are stored on the web server, and the JSON response to Layar includes a link to one for each hotspot in range of the user. Since we have to write these files to the server anyway, using the feature ID of the hotspot as the filename allows us to use cached copies rather than recreating the same model more than once.
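The wall-building part above can be sketched in a few lines: each edge of the ring becomes a vertical quad, split along a diagonal into the two triangles a renderable face requires. The .l3d encoding itself isn’t shown, since that spec comes from Layar support rather than public docs; this hypothetical helper just produces the (x, y, z) triangles:

```python
def extrude_walls(ring, height):
    """Turn a closed 2D ring into triangulated vertical walls.

    For each edge of the ring, the quad between the base edge and
    the top edge (at z = height) is split along a diagonal into two
    triangles of (x, y, z) vertices.
    """
    tris = []
    n = len(ring)
    for i in range(n):
        (x1, y1), (x2, y2) = ring[i], ring[(i + 1) % n]
        a, b = (x1, y1, 0.0), (x2, y2, 0.0)         # base edge
        c, d = (x2, y2, height), (x1, y1, height)   # top edge
        tris.append((a, b, c))  # lower triangle of the quad
        tris.append((a, c, d))  # upper triangle of the quad
    return tris
```

A ring of n vertices yields 2n wall triangles; the tricky concave-top triangulation is left to a proper library.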
Step Four – launch! Layar has a test site, and you can allow access to selected users before you go live; they just have to register and create a password. Once your layer is working and tested, you submit it for review before it is published; the Layar support team checks each new layer to make sure it returns results. There are a number of options for setting up layers and the overall look and feel, but they’re pretty easy to step through and experiment with.
Try it out – There are a number of ways to view this layer on your Android phone or iPhone (after downloading and installing the Layar app).
- Search for ArcGIS Soils in the Layar browser.
- Use this link (http://m.layar.com/open/soils) from your mobile device’s web browser.
- Install this .apk (http://layar.gisinc.com/Layar/3dSoils.apk) on your Android phone; it will open Layar and load the soils layer with one click.
For those in Pulaski County, Virginia or Jefferson/Shelby Counties in Alabama, you’ll need to make sure the GPS on your phone is enabled. If you are outside of the test area, you can still load the data by opening the Layar app, expanding the ’Outside Test Area?’ option and checking the override GPS toggle. This will allow results to be shown on your phone, although the data will be shifted to match your location.
If you don’t have one of these awesome phones, here’s an idea of what it looks like in action, although it’s pretty choppy due to the refresh rate: