One of the protagonists of the novel Halting State, by Charles Stross, is Sue Smith, a police sergeant in Edinburgh in the year 2018 or so. Whenever Sergeant Smith is on duty, her eyeglasses are logged in to "CopSpace." Through these networked glasses, she can see the world in its usual glory and squalor, but the glasses add an overlay of police information. When she looks at a building, she sees a summary of the crimes committed there hovering near the front door. If she is called to a crime scene, she can see a big 3D diamond floating over the scene from blocks away to guide her to her destination. And suspects? When she fixes them with her steely gaze, their faces crawl with bad credit ratings, parking tickets, unpaid taxes, and other, darker secrets extracted from the police databases.
Sergeant Smith's fancy goggles represent one science fiction version of augmented reality. If you search the web for augmented reality here in 2009, you will quickly find photographs of researchers wearing amusing headgear connected to computers in backpacks. To reach the goal of augmented reality goggles, these researchers face some daunting challenges, including the geometric and computational-speed problems of tracking the position of the wearer's head quickly and accurately enough to prevent seasickness as the augmented images trail the real world by a fraction of a second.
Another approach to augmented reality, however, is to forget about the (admittedly very cool) idea of visually overlaying data on whatever object you happen to be looking at. Instead, perhaps we can just bring a mobile phone with us and consult its location-sensitive software for information about the world around us. For example, imagine taking an iPhone around the Carleton campus. Stand in front of Goodsell Observatory, and the iPhone might tell you about Carleton's role in the history of railway time-keeping, or this month's schedule of Friday-night sessions with the big telescope, or the office hours of the Linguistics department. The phone would be augmenting reality in interesting and useful ways, and you wouldn't have to bring a vomit bag along for the trip.
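The core of such a system is matching the phone's reported position against a list of nearby points of interest. Here is a minimal language-neutral sketch of that idea; the coordinates, place names, and the 75-meter threshold below are made up for illustration, not surveyed campus data:

```python
import math

# Hypothetical campus points of interest: (name, latitude, longitude).
# These coordinates are illustrative placeholders only.
POINTS_OF_INTEREST = [
    ("Goodsell Observatory", 44.4615, -93.1535),
    ("Gould Library", 44.4600, -93.1560),
    ("Sayles-Hill Campus Center", 44.4597, -93.1525),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest_poi(lat, lon, pois=POINTS_OF_INTEREST, max_dist_m=75.0):
    """Return the name of the closest point of interest within
    max_dist_m meters of the given position, or None if nothing is close."""
    best = min(pois, key=lambda p: haversine_m(lat, lon, p[1], p[2]))
    if haversine_m(lat, lon, best[1], best[2]) <= max_dist_m:
        return best[0]
    return None
```

A real implementation would get the latitude and longitude from the phone's location API rather than hard-coding a query point, and would need a more thoughtful threshold, since consumer GPS fixes can easily be off by tens of meters.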
For this project, you will use the location-aware features of a mobile phone (iPhone or G1, yet to be determined) to develop a simple augmented reality system, focusing on Carleton's campus. In particular, you will design and implement:
A well-designed and well-executed system of this sort could be shared with visitors to Carleton who happen to own the right kind of phone. Once you have a prototype, we might want to consult with the Admissions Office, for example, to find out whether they would find such a system useful.
In the fall, you'll work with a librarian to do a thorough literature search to find out what others have done in this area. In the meantime, here are a few relevant links.
The Augmented Reality Games project at MIT offers several interesting examples of phone-based augmented reality.
Apple's Core Location framework for iPhone provides an example of the sort of programming tools available for writing location-dependent software.
Wikipedia's page on Geographic Information Systems talks about the layering of data, and points to other resources on that topic.