Macworld’s Aayush Arya uses Google’s new voice recognition search application, and he likes it a lot.
I downloaded it last night, and I like it. It works really, really well. Its voice recognition is quite good, getting all of the basic search queries correct, and the results are excellent. For example, besides searching for the standard movie showtimes, I tried searching for “weather forecast,” “Thai food,” and “oil change.”
All worked exactly as I expected. “Weather forecast” brought up current and weekly local weather conditions; “Thai food” brought up a list of close Thai restaurants; and “oil change” listed local oil change shops.
This isn’t going to be useful all of the time, obviously. But when I’m driving in an unfamiliar area, and it’s more convenient to bring the phone up to my ear than to type searches into Google Maps, it’s going to be incredibly useful.
It’s especially useful for letting friends and family who are unfamiliar with the iPhone search for things. Rather than try to explain Google Maps (whose UI is, unfortunately, unintuitive) to them while I’m driving, I can hand them my iPhone, tell them to touch the “G” icon in the dock, and then just say what we’re looking for. That’s much easier.
But Arya is right. Google’s voice recognition is so good, it begs to be implemented system-wide. Lack of voice recognition is my biggest complaint, and building this (or Apple’s own voice recognition technology) into the iPhone would immediately make it a better phone.
I like Arya’s suggestion for how it should work, too:
How about this: If you bring the phone up to your ear while you’re on the home screen and a call isn’t in progress, it should immediately activate voice recognition and dial the contact or number you speak into it.