gesture controls. A few more pages from that expanding tome were just made public, and the concepts unearthed are certainly thought-provoking. The first involves using a proximity sensor in addition to the touch panel to register gestures in 3D. For example, you could use three fingers to mark out the corners of a triangle on the screen and then "pull up" and pinch to create a pyramid for use in a CAD application. The second idea involves gestures based on intuitive "physics metaphors" that are recognized using motion sensors. So instead of navigating menus to start a file transfer between an iPhone and iPad, the user could arrange the desired files on the phone's screen and then pretend to "pour" them onto the tablet -- an idea that rather reminds us of the funky Project Blox. Oh, did we just make a Cupertino lawyer twitch?
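
For the curious, that "pour" metaphor could plausibly boil down to a simple tilt threshold computed from the accelerometer. Here's a minimal sketch of the idea -- the function names and the 60-degree threshold are our own hypothetical choices, not anything from the patent filing:

```python
import math

POUR_ANGLE_DEG = 60.0  # hypothetical threshold: tilting past this "pours"

def pitch_from_accel(ax, ay, az):
    """Estimate tilt (degrees) from a 3-axis accelerometer reading.

    With the device flat on a table, gravity sits entirely on the z
    axis and the tilt is ~0; tipping the device toward vertical
    raises the angle toward 90.
    """
    return math.degrees(math.atan2(math.hypot(ax, ay), abs(az)))

def should_pour(samples):
    """Return True once any reading crosses the pour threshold."""
    return any(pitch_from_accel(*s) >= POUR_ANGLE_DEG for s in samples)

# Flat on the table: gravity (~9.8 m/s^2) all on z -> no transfer.
flat = [(0.0, 0.0, 9.8)]
# Tipped roughly 70 degrees toward the tablet -> transfer fires.
tipped = [(0.0, 9.2, 3.4)]
```

A real implementation would obviously also need device discovery and a transfer channel; this only covers the gesture-trigger half.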