Monday, September 12, 2011

A new set of ears and eyes

Over the past couple of months, I have noticed a series of iOS applications that use the camera on a phone or iPad to bring order to the 'chaos' around me. In each application, the image captured by the camera is compared against a specific database of objects (sometimes combined with GPS location) to return search results without ever invoking a browser. For example:

Zagat updated its search capability with an eye icon at the top of the application. Pan your camera along a street and up come summary restaurant rankings, with no clicks and no 'formal' search invoked.

Fooducate uses bar codes to give you nutritional rankings, and suggestions, for packaged items.

Skin of Mine uses pictures of your skin (moles, spots, etc.) taken with your camera to search against its database of skin disorders, and either analyzes the issue or offers to put you in touch with a professional for a consultation. Nice lead generation for the derm folks, and a streamlined process for the consumer.

LeafSnap, recommended by my farmer friend Ron, lets you identify trees by snapping photos of their leaves.

Google Shopper is a mobile search tool that relies on the camera for searches on items outside its 'Featured items'. For some items, such as books, no bar code is required: it scans the image of the cover and matches it against Google's database. It's really amazing to see in action, and its GPS abilities increase its relevance. eBay also lets you search for scanned products via RedLaser (acquired last year) or its own application.
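Under the hood, all of these apps share the same basic pattern: boil a camera frame down to a compact fingerprint, then find the closest fingerprint in a database of known objects. The toy "average hash" and the tiny grayscale grids below are my own illustrative inventions, not any of these apps' actual algorithms, but they show the shape of the idea:

```python
def average_hash(pixels):
    """Fingerprint a grayscale image (rows of 0-255 values):
    each bit records whether a pixel is above the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length fingerprints."""
    return sum(x != y for x, y in zip(a, b))

def best_match(query, database):
    """Return the database label whose fingerprint is closest to the query."""
    return min(database, key=lambda label: hamming(query, database[label]))

# Hypothetical database of known objects, fingerprints precomputed offline.
db = {
    "maple leaf": average_hash([[200, 40], [210, 35]]),
    "oak leaf":   average_hash([[30, 220], [25, 230]]),
}

# A new camera frame: slightly noisy, but closer to the maple pattern.
snap = average_hash([[190, 50], [205, 45]])
print(best_match(snap, db))  # → maple leaf
```

The fingerprint makes the match robust to lighting and noise, which is why a blurry snap of a book cover or a leaf can still land on the right database entry.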

I suspect that camera- and/or voice-enabled applications such as Siri (acquired by Apple) and Dragon Go! are now well on the way to fundamentally changing users' interaction with their computers, phones, and tablets. From past experience, we know that when the user interface changes substantially, as with Windows, the various browsers, or the iPad, tectonic market shifts follow.