A new Apple patent reveals that the company has been working on gesture-based commands detected by audio transducers placed at the corners of a given surface -- in other words, as your fingers press and tap on a surface (like a keyboard, a computer casing, or even the bezel around an iPad), the audio receivers would determine where and how you touched it, and drive a user interface accordingly. The patent outlines a few different ways this could be done, from listening to vibrations in the housing itself to simply keeping an electronic ear out for the sound of fingers on the surface.
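The patent doesn't spell out the math, but the corner-transducer idea maps naturally onto time-difference-of-arrival (TDOA) localization, the same trick used in acoustic touch panels: a tap's sound reaches each corner at a slightly different moment, and those differences pin down where it happened. Here's a hypothetical sketch, assuming four corner sensors, a made-up propagation speed, and a simple grid search; none of the names or numbers come from the patent.

```python
import math

# Assumed speed of sound through the surface material (m/s) -- purely illustrative.
SPEED = 1500.0

def simulate_arrivals(tap, sensors, speed=SPEED):
    """Return the arrival time of the tap's sound at each sensor."""
    return [math.hypot(tap[0] - sx, tap[1] - sy) / speed for sx, sy in sensors]

def locate(arrivals, sensors, width, height, step=0.001, speed=SPEED):
    """Grid-search for the tap position whose predicted time differences
    (relative to sensor 0) best match the measured ones."""
    measured = [t - arrivals[0] for t in arrivals]
    best, best_err = None, float('inf')
    y = 0.0
    while y <= height:
        x = 0.0
        while x <= width:
            pred = simulate_arrivals((x, y), sensors, speed)
            diffs = [t - pred[0] for t in pred]
            err = sum((m - p) ** 2 for m, p in zip(measured, diffs))
            if err < best_err:
                best, best_err = (x, y), err
            x += step
        y += step
    return best

# Four sensors at the corners of a 30 cm x 20 cm surface, tap at (0.12, 0.07).
corners = [(0.0, 0.0), (0.3, 0.0), (0.0, 0.2), (0.3, 0.2)]
tap = (0.12, 0.07)
est = locate(simulate_arrivals(tap, corners), corners, 0.3, 0.2)
```

In practice you'd solve the hyperbolic TDOA equations in closed form or by least squares rather than brute force, but the grid search keeps the geometry visible.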
It seems like this would all happen through contact with the surface itself, though we've also seen in-air gestures tracked with ultrasound, where a set of receivers determines movement and placement in the air. Apple has also described the apparatus as an add-on to a laptop screen, so the company is playing around with it in a few different ways. It's hard to see how you'd do multi-touch with a setup like this, but of course this is just research rather than application.
And as we always say about these patents, it's unlikely we'll see this in an actual product any time soon. In my estimation, this is something Apple was considering before it settled on the capacitive surface for the iPad -- now that Apple has bet big on those components, it's unlikely the company would switch touch technologies in the middle of a product's life.