Apple has included limited voice control functionality in the iPhone since the debut of the iPhone 3GS, but according to 9to5 Mac, the company's plans for voice control in iOS 5 are reportedly far broader. While no such feature was announced at WWDC and it has yet to show up in the iOS 5 betas, 9to5's sources claim Apple is already trying out an "Assistant" feature on internal test units, one meant to introduce widespread speech-to-text functionality in iOS 5.
Those of you who have used Siri or the recently launched Dragon Go! know how powerfully third parties have already leveraged speech-to-text in iOS. Apple purchased Siri outright in 2010 and has reportedly been working with Dragon's parent company, Nuance, to further expand the speech-to-text functions of iOS 5. Apple's partnership with Nuance even extends to OS X, with several of Lion's optional text-to-speech voices coming directly from Nuance's stable of high-quality voices (we'll provide an overview of those new voices in an upcoming post -- spoiler alert: they are awesome).
9to5 notes that since these new features have been missing from developer betas of iOS 5, the new voice navigation "Assistant" may turn out to be an iPhone 5 exclusive. However, it's equally plausible that since the feature has only just entered internal testing, Apple simply isn't ready for developers to begin prodding at it in the betas. Given how well Siri and Dragon Search already function, there doesn't appear to be any technical reason why the iPhone 4 or iPad 2 couldn't run the "Assistant" feature as it's been described.
According to 9to5's additional findings, a "start" button on the systemwide keyboard will initiate the speech-to-text function via a popover microphone screen. Much as Dragon Dictation does, the system would then transcribe that speech into text at the user's option.
The implications of systemwide speech-to-text in iOS 5 are fairly extraordinary. Siri and Dragon Go! have already shown how naturally spoken language can be leveraged for web searches; if that same capability extends to the rest of the OS, it may be possible to navigate the iPhone entirely by voice. Apple has already filed a patent covering such features, so this is far more than pie-in-the-sky musing on our part.
This obviously wouldn't be an out-and-out replacement for the traditional touchscreen interface, but as a supplement to the touchscreen, a voice nav system would be a very powerful tool. I'm already picturing a day when I can ask my iPhone for directions to the nearest petrol station without having to pull off to the side of the road and fiddle with the screen first. Hopefully the "Assistant" feature will be a bit smarter than Voice Control when navigating my music library by voice, so I'll have fewer instances of my iPhone translating "Play album: Kid A" into "Calling 555-8888."
Although 9to5 has turned up some compelling evidence that Apple is working on integrating this feature, there's no telling when it will actually debut. If it's ready in time, I wouldn't be surprised to see it as one of the marquee features of the rumored September iPhone event.