For the past year or two of AR’s short life, the technology it uses to fix the virtual layer on the real world has been kind of dodgy.
There have been two ways of doing it: geolocation, relying on GPS, which notoriously has a 10-metre error margin; or image recognition – nice, but it usually has to be paired with a graphic to work well (a QR code or suchlike).
A Swedish start-up, 13th Lab, has come up with a third way, which cunningly combines both of the above – making the virtual layer much more intelligent and sensitive to the surrounding world.
Nasa apparently uses it to control robots. 13th Lab have got the system down to something they can put in an iPad app.
It uses a computer vision technique known as SLAM – that's Simultaneous Localization and Mapping – which helps a robot look around and understand where it is.
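To give a flavour of the idea (this is a toy sketch, not 13th Lab's actual algorithm, and all the names and numbers here are made up for illustration): a robot that only trusts its own motion sensors drifts off course, but if it also measures the distance to a landmark, it can refine its guess of where it is and where the landmark is at the same time – localization and mapping, simultaneously.

```python
# Toy 1-D illustration of the SLAM idea: a robot dead-reckons with
# biased odometry, while range readings to a single landmark let it
# refine both its own position and the landmark's position at once.
# This is a hypothetical sketch for intuition only.

def slam_1d(moves, ranges, landmark_guess):
    """Fuse odometry and landmark ranges with a simple 50/50 correction.

    moves          -- odometry readings (each slightly biased)
    ranges         -- measured distance to the landmark after each move
    landmark_guess -- rough initial estimate of the landmark position
    """
    pose = 0.0                 # where the robot thinks it is
    landmark = landmark_guess  # where it thinks the landmark is
    for move, rng in zip(moves, ranges):
        pose += move                    # predict: apply (biased) odometry
        observed_landmark = pose + rng  # landmark implied by this reading
        # Correct: split the disagreement between map and pose estimates.
        error = observed_landmark - landmark
        landmark += 0.5 * error
        pose -= 0.5 * error
    return pose, landmark

# Robot truly moves 1.0 per step but odometry over-reads as 1.1;
# the landmark truly sits at 10.0, so the true ranges are 9, 8, 7.
pose, landmark = slam_1d([1.1, 1.1, 1.1], [9.0, 8.0, 7.0], 9.5)
```

Dead reckoning alone would put the robot at 3.3 when it is really at 3.0; folding in the landmark readings pulls both the pose and the map estimate closer to the truth. Real SLAM systems do this with probabilistic filters over many visual features at once, which is what makes the AR layer stick to the world.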
They’ve built a little game with the technology, and it’s well fun. It’s called Ball Invasion (iTunes) – video above. It’s built for the iPad 2 too, a natural home for AR with its large screen and big battery. This could be taken a lot further though. Watch this space…
Related: Ten Best Augmented Reality Apps [via GigaOM]