Acrossair on the iPhone

It looks like iPhone OS 3.1 is going to do nothing more than open up the video feed from the camera so that you can overlay data on top of that video. In essence, the augmented reality app is using your iPhone’s video as a “desktop” picture and placing items on top of it. Acrossair’s iPhone app, Nearest Tube, uses the OpenGL libraries to skew and distort that data as you point the camera in different directions, thus providing a little more of a 3D perspective than something like Layar, which I have talked about previously on this blog. Chetan Demani, one of the founders of Acrossair, also points out that, going forward, any company making AR-type apps will need to utilize existing location information and pre-load all the data they want to display. So the nirvana of just-in-time downloads of location data to overlay on your iPhone video image is not here, and may not be for a while. What will differentiate the software producers, though, is the relevancy and accuracy of their location information. So there will be room for competition for quite some time.
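
To make the pre-load-and-overlay idea concrete, here is a minimal sketch of how an app might place a bundled point of interest on screen from the device’s compass heading. This is purely illustrative, not Acrossair’s implementation: the station data, the field-of-view figure, and the class names are my own assumptions.

```swift
import CoreLocation
import UIKit

// Illustrative only: a pre-loaded station list and a rough camera field of view.
struct Station {
    let name: String
    let coordinate: CLLocationCoordinate2D
}

final class OverlayPlacer {
    // Pre-loaded data: the app bundles its points of interest rather than
    // fetching them just-in-time over the network.
    let stations = [
        Station(name: "Oxford Circus",
                coordinate: CLLocationCoordinate2D(latitude: 51.5152, longitude: -0.1418))
    ]

    let cameraFieldOfView: Double = 60 // degrees; rough assumption for the iPhone camera

    /// Returns the horizontal screen position (0...1) of a station,
    /// or nil if it lies outside the camera's field of view.
    func horizontalPosition(of station: Station,
                            userLocation: CLLocation,
                            deviceHeading: CLHeading) -> Double? {
        let bearing = bearingDegrees(from: userLocation.coordinate, to: station.coordinate)
        // Angle between where the camera points and where the station lies.
        var delta = bearing - deviceHeading.trueHeading
        if delta > 180 { delta -= 360 }
        if delta < -180 { delta += 360 }
        guard abs(delta) <= cameraFieldOfView / 2 else { return nil }
        // Map [-fov/2, +fov/2] onto [0, 1] across the screen width.
        return (delta + cameraFieldOfView / 2) / cameraFieldOfView
    }

    private func bearingDegrees(from: CLLocationCoordinate2D,
                                to: CLLocationCoordinate2D) -> Double {
        let lat1 = from.latitude * .pi / 180, lat2 = to.latitude * .pi / 180
        let dLon = (to.longitude - from.longitude) * .pi / 180
        let y = sin(dLon) * cos(lat2)
        let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
        return (atan2(y, x) * 180 / .pi + 360).truncatingRemainder(dividingBy: 360)
    }
}
```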

He went on to say that it’s pretty simple to do AR applications using the new 3.1 APIs, due out in September: “It’s a pretty straightforward API. There’s no complexity in there. All it does is it just switches on the video feed at the background. That’s the only API that’s published. All we’re doing is using that video feed at the back. It just displays the video feed as if it’s a live camera feed.”
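
For reference, the mechanism he describes corresponds to drawing your own views on top of the camera picker’s live feed. The sketch below is in modern Swift rather than the Objective-C of the 3.1 era, using UIImagePickerController’s cameraOverlayView; the label text and class name are placeholders, not Acrossair’s code.

```swift
import UIKit

final class ARViewController: UIViewController {
    func presentCameraWithOverlay() {
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }

        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.showsCameraControls = false   // hide the shutter UI; we only want the feed

        // Any UIView works as an overlay; the camera feed stays "at the back".
        let overlay = UIView(frame: view.bounds)
        overlay.backgroundColor = .clear
        let label = UILabel(frame: CGRect(x: 20, y: 80, width: 280, height: 30))
        label.text = "Nearest station ↑ 250 m"   // placeholder annotation
        label.textColor = .white
        overlay.addSubview(label)

        picker.cameraOverlayView = overlay
        present(picker, animated: false)
    }
}
```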

via Augmenting Reality with the iPhone – O’Reilly Broadcast.
