Tonight, Google launched some cool new search features at its Search Event. TechCrunch has been covering the event live, revealing some of what we can expect from Google in the near future.
One of these things is Google Goggles, which allows users to take a picture of an object and then use that image as a search query. This is obviously a perfect fit for mobile phones: users simply take a shot of the thing they want to search for, and they are then shown the Google results for it.
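Just to give a feel for what an image-as-query flow might look like under the hood, here is a minimal Python sketch. Everything in it is hypothetical: the endpoint, parameters, and response format are my own illustration, not Google's actual API.

```python
# A minimal sketch of an image-as-query flow, assuming a hypothetical
# visual search endpoint (not Google's actual API).
import requests

def search_by_image(image_path):
    """Upload a photo and return the titles of matching results."""
    with open(image_path, "rb") as f:
        # The URL and response format are made up for illustration.
        response = requests.post(
            "https://example.com/visual-search",
            files={"image": f},
        )
    response.raise_for_status()
    return [hit["title"] for hit in response.json()["results"]]

if __name__ == "__main__":
    for title in search_by_image("photo_of_a_book.jpg"):
        print(title)
```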
According to Google, two thirds of our brain is involved in visual processing, which makes visual search an important feature to consider for the future.
Some of you may know that a couple of visual search apps have been around for some time on the App Store and Android Market. Perhaps the best-known visual search app at the moment is SnapTell, available for both the iPhone and Android handsets. It allows users to take a photo of a book, DVD or video game and then displays the title of the item along with a list of its prices at some of the best-known online retailers.
Nokia has also had a similar application for its Symbian platform for some time now, but I am not sure it ever really took off. It's called Nokia Point and Find, and you can find details about it at this link or see how it works in the video below.
Google is already well positioned on all three of these new types of search. Let's briefly consider them...
Search by location is something to look out for very soon.
According to tonight's presentation, Google's mobile homepage will soon be able to adjust its search suggestions based on the user's location. In an example shown to the audience at the Computer History Museum, the venue of the presentation, Vic Gundotra demonstrated how the same Google search for "Re" produced different suggestions on one iPhone with the location set to Boston (top suggestion: "Red Sox") and on another set to San Francisco (top suggestion: "REI").
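To make the idea concrete, here is a toy Python sketch of location-biased suggestions. The suggestion data is invented for illustration; Google's real suggestion service obviously ranks candidates very differently.

```python
# A toy illustration of location-biased query suggestions.
# The per-city popularity data below is invented; it is not
# how Google's actual suggestion service works.
SUGGESTIONS_BY_CITY = {
    "Boston": ["Red Sox", "REI", "Restaurants"],
    "San Francisco": ["REI", "Restaurants", "Red Sox"],
}

def top_suggestion(prefix, city):
    """Return the first suggestion matching the prefix for the given city."""
    for candidate in SUGGESTIONS_BY_CITY.get(city, []):
        if candidate.lower().startswith(prefix.lower()):
            return candidate
    return None

print(top_suggestion("Re", "Boston"))         # Red Sox
print(top_suggestion("Re", "San Francisco"))  # REI
```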
Before I end this post, I want to highlight one more goodie from Google Labs, launched a few weeks ago, which I never got around to introducing.
Google Image Swirl is a new way of organizing image search results by considering both visual and semantic similarity.
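Conceptually, that could mean blending two similarity scores into one. Here is a toy Python sketch of that idea, with made-up feature vectors and weighting; it is my own illustration, not Google's actual algorithm.

```python
# A toy sketch of blending visual and semantic similarity into a
# single score. The vectors and the weighting are invented for
# illustration; this is not Image Swirl's real algorithm.
import numpy as np

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def combined_similarity(vis_a, vis_b, sem_a, sem_b, alpha=0.5):
    """Blend visual and semantic cosine similarities with weight alpha."""
    return alpha * cosine(vis_a, vis_b) + (1 - alpha) * cosine(sem_a, sem_b)

# Two images that look alike (visual features) but depict different
# subjects (semantic features) end up with a middling combined score.
vis_1, sem_1 = np.array([0.9, 0.1]), np.array([1.0, 0.0])
vis_2, sem_2 = np.array([0.8, 0.2]), np.array([0.0, 1.0])
print(combined_similarity(vis_1, vis_2, sem_1, sem_2))
```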
All that is left for me to say at this point is enjoy. And keep searching!