Having a compass and camera built into the iPhone opens up possibilities for augmented reality (AR). Derek Smith of SimpleGeo, a developer of their AR framework, will step you through the challenges of providing a real-time overlay on the physical world. Nicola Radacher of Mobilizy will discuss what they've learned across the four releases of their popular Wikitude app.
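The core geometry behind such an overlay is mapping the compass heading and the bearing to a point of interest onto a horizontal screen position. A minimal sketch of that math (the function names and the 60° field of view are illustrative assumptions, not iPhone specifications):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def overlay_x(heading_deg, target_bearing_deg, screen_width_px, fov_deg=60.0):
    """Horizontal pixel position for a point of interest, or None if off-screen.

    The point sits at the screen centre when the device heading matches the
    bearing, and slides left or right proportionally within the camera's
    assumed field of view.
    """
    # Signed angular difference, normalized into (-180, 180]
    delta = (target_bearing_deg - heading_deg + 180) % 360 - 180
    if abs(delta) > fov_deg / 2:
        return None  # outside the visible field of view
    return screen_width_px / 2 + (delta / fov_deg) * screen_width_px
```

For example, a point 15° to the right of a 320-pixel-wide view centred on the heading lands at `overlay_x(0, 15, 320)` → 240.0; a real framework also has to smooth the jittery compass readings before doing this projection.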
The camera and its associated APIs on the iPhone are easy to get started with, but unlocking their full potential is tricky. Occipital co-founder Jeffrey Powers discusses getting started with the camera, then dives into the advanced tricks Occipital uses for photo capture in Snapture and for barcode recognition in RedLaser.
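Once a reader has decoded the bars of a UPC-A barcode into digits, it still has to validate them; the check-digit arithmetic below is the standard UPC-A rule, shown purely as an illustration (it is not RedLaser's implementation):

```python
def upca_check_digit(digits11):
    """Check digit for an 11-digit UPC-A payload (the 12th printed digit).

    Digits in odd positions (1st, 3rd, ...) are weighted 3, the rest 1;
    the check digit brings the weighted sum up to a multiple of 10.
    """
    s = sum(d * (3 if i % 2 == 0 else 1) for i, d in enumerate(digits11))
    return (10 - s % 10) % 10

def upca_is_valid(code12):
    """Validate a full 12-digit UPC-A code given as a string of digits."""
    digits = [int(c) for c in code12]
    return len(digits) == 12 and digits[-1] == upca_check_digit(digits[:-1])
```

A single bad digit changes the weighted sum, so `upca_is_valid` rejects most one-symbol decoding errors before the app ever looks the product up.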
The iPhone includes several ways to determine your location: cellular tower triangulation, Wi-Fi-based positioning using Skyhook Wireless's system, and GPS. Alok Deshpande from Loopt will discuss what they learned about building location-aware apps. He will be followed by Nick Brachet, who will give an overview of the Skyhook system and how it uses Wi-Fi to provide location to applications.
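Whichever positioning source supplies the fix, a location-aware app usually needs the distance between two coordinates (for example, to find nearby points of interest). The standard haversine formula, sketched here as plain math rather than any particular app's code:

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (in degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
```

One degree of longitude at the equator works out to roughly 111 km; in practice the app should also weigh each fix's accuracy radius, since a cell-tower fix can be off by a kilometre while GPS is often within metres.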
The iPhone, like many high-end smartphones these days, comes with a number of sensors: camera, accelerometer, GPS module, and digital compass. We're entering a period of change: more and more users expect these sensors to be integrated into the "application experience." If your application can make use of them, it probably should.
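A small example of putting a sensor to work: raw accelerometer samples mix gravity with user motion, and a common first step is an exponential low-pass filter to isolate the slowly varying gravity component. A minimal sketch (the 0.1 smoothing factor is an assumed tuning value, not an Apple recommendation):

```python
def low_pass(samples, alpha=0.1):
    """Exponential low-pass filter over one accelerometer axis.

    Tracks the slowly varying gravity component; smaller alpha gives a
    smoother but laggier estimate. Subtracting the result from the raw
    samples leaves the user-generated acceleration (a high-pass filter).
    """
    filtered = []
    gravity = samples[0]  # seed with the first reading
    for s in samples:
        gravity = alpha * s + (1 - alpha) * gravity
        filtered.append(gravity)
    return filtered
```

With a steady input the filter converges to that value, while a brief shake barely moves it, which is exactly what makes the gravity estimate usable for detecting device orientation.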