To hear Danielsen explain it, the phone's speaker can act "like the mouth of a bat," emitting sound at ultrasonic frequencies (in this case, between 23kHz and around 35kHz). That makes the phone's microphone the equivalent of the bat's ears, listening for how our faces or hands or whatever else distort that inaudible sound. Interpreting that shifting soundscape falls to the software, which the company calls "Beauty," and which determines when an appendage is close enough to shut the screen off. Voilà: the behavior you expect, with one less bit of hardware involved.
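The bat analogy maps neatly onto classic echolocation: ping, wait for the echo, and use the round-trip time to judge distance. As a rough illustration (not Elliptic's actual algorithm, whose internals aren't public), here's a minimal sketch in plain Python that simulates a 25kHz ping, a faint echo off a nearby hand, and a cross-correlation search for the echo delay; all sample rates, thresholds, and function names are assumptions for the sake of the example.

```python
import math

FS = 96_000             # assumed sample rate (Hz); high enough to capture ultrasound
TONE_HZ = 25_000        # inaudible carrier, within the 23-35kHz band Danielsen cites
SPEED_OF_SOUND = 343.0  # metres per second at room temperature

def ping(n_samples, freq=TONE_HZ, fs=FS):
    """A short ultrasonic burst from the speaker."""
    return [math.sin(2 * math.pi * freq * i / fs) for i in range(n_samples)]

def simulate_echo(burst, distance_m, fs=FS, attenuation=0.3):
    """Toy mic capture: silence until the round trip completes, then a faint echo."""
    delay = int(round(2 * distance_m / SPEED_OF_SOUND * fs))
    return [0.0] * delay + [attenuation * s for s in burst]

def estimate_distance(burst, mic, fs=FS):
    """Cross-correlate the burst against the mic signal to find the echo delay."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(mic) - len(burst) + 1):
        score = sum(b * mic[lag + i] for i, b in enumerate(burst))
        if score > best_score:
            best_lag, best_score = lag, score
    # lag in samples -> round-trip seconds -> one-way metres
    return best_lag / fs * SPEED_OF_SOUND / 2

burst = ping(96)                       # ~1ms burst
mic = simulate_echo(burst, 0.05)       # a hand roughly 5cm from the phone
distance = estimate_distance(burst, mic)
screen_off = distance < 0.10           # assumed "too close" threshold of 10cm
```

In the real system the software is doing something far subtler than a single time-of-flight estimate (faces and hands distort the sound field continuously rather than returning one clean echo), but the principle of emitting an inaudible signal and reading its reflections is the same.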
If the name "Elliptic Labs" sounds familiar, it's because the company has been trying to squeeze its ultrasound tech into smartphones and laptops for years. Most recently, it trekked to Barcelona for last year's Mobile World Congress to show off how quickly smartphones with the right emitting and receiving hardware can respond to hand gestures up to 2 meters away. (The answer, as evidenced by this video, is super fast.) This time, though, Elliptic's mission was to subtract hardware from a smartphone, not add to it.
You probably shouldn't expect your next phone to get dramatically smaller as a result; the components that make up a proximity sensor aren't huge to start with. Danielsen says they're usually only about 2mm x 2mm. Smartphones are pieced together like jigsaw puzzles, though, and freeing up even that much space could lead to a reduction in size or the addition of something more useful. More importantly, dropping the extra component makes it easier to squeeze this tech into new devices at scale. That's a big deal for a small startup, and less work for smartphone makers looking for a different way to do things. Danielsen says multiple (sadly unnamed) OEMs are considering running with Elliptic's approach in their 2016 devices, but we'll soon see how many of them really follow through.