Siri's phone integration is top-notch. I use it on a daily basis to place calls. So why, after placing a call, do I have to look at my screen and tap to switch the call to speaker mode? If I'm using hands-free to place the call, you'd think the phone would continue in hands-free mode after the Phone app takes over.
Yes, of course, many people use Siri with "Raise to Speak." When this feature is enabled, users trigger Siri by bringing the phone to their ear and waiting for the Siri chimes. But for those of us at stoplights or sitting at our computer, it's often a lot easier to reach to the side and press the Home button instead.
iOS does offer a speakerphone accessibility feature, but it applies to incoming calls only. Outgoing calls, such as those placed by Siri, still require a tap on the speaker button.
The same proximity sensor that triggers "Raise to Speak" should be able to sense that my phone, my cheek, and, basically, my entire body are nowhere near the screen, and automatically switch on the speaker.
Me? I've filed an enhancement request at Apple's iPhone feedback page. I encourage you to do the same.