Back in the day (i.e. last year), Fetch was best known for using a crew of humans to respond to messages sent from the app. If you wanted to buy, say, a sweet messenger bag someone was rocking in SoHo, you could snap a photo, send it along, and someone would eventually respond with the cheapest, most appropriate listing they could find. With Expect Labs' voice recognition and analytical chops now being baked into the existing iOS/Apple Watch app, though, those requests can be chopped up and acted on more quickly. The end result? A faster first wave of hits, and a less headache-inducing shopping experience (they hope).
Let's say you're itching to laze under the sun in some far-off locale. You'll be able to ask your Apple Watch to book you on the first flight to Bangkok next Thursday, and Expect Labs' thoughtful back-end will dig up a handful of suitable flight options. From there, those results will get passed along to Fetch's crew of shopping concierges so they can ferret out the best option and send it back home to you for approval. Well, eventually, anyway. Fetch and MindMeld are talking up a partnership today, but a spokesperson confirmed that the actual functionality won't go live for another few months (hopefully in time for a last-minute summer holiday). And if you're one of the countless multitudes who don't -- or won't -- wear an Apple Watch? No worries: The feature will find its way to Fetch's Android app, too, though you'll have to pay Fetch $10 a month on any platform for the privilege.