Designer Cody Sanfilippo put together this concept video of how Spotlight on iOS could work, and it's quite impressive. The current implementation of Spotlight for iOS (accessed by swiping left on your iOS device's home screen) is fairly useless; while the search itself is thorough, the list of results isn't very helpful, and you generally end up searching through it for almost as long as it would have taken to just go find whatever you're looking for on your iPhone directly.
But Sanfilippo's concept wisely separates the results into filtered windows, then packs as much information about each item into them as possible. There's even a solid bit of functionality: you can not only access apps and other information directly from Sanfilippo's Spotlight screen, but also act on results from that same screen, doing things like installing apps or sending tweets.
Unfortunately for Sanfilippo, I don't think Apple is as interested in building out Spotlight in this way as it is in building out, say, Siri. Many of these functions are already accessible through Apple's voice assistant, and more are added all the time. Plus, Siri provides a nicely patentable "black box" implementation: users unfamiliar with all of these various functions aren't confused by seeing more information and options than they need. Of course, for those of us who could make use of a setup like this, it looks great. But it seems unlikely that Apple will put this much work into making Spotlight that much more useful when it already has other options to work with.