
Microsoft's car sensor sees beyond cameras and radar

It could stop your car from hitting people its cameras can't see.

Microsoft has teamed up with automated driving firm IAV to create a sensor that "sees" what cameras and radar might not pick up. Using Azure and Windows 10, the CHAD (connected highly automated driving) vehicle not only lets drivers access Cortana and Skype, but also locates pedestrians hidden from view via a connection between their smartphone or Microsoft Band 2 (or, you know, another connected wearable) and a connected stop light.

During the demo, the autonomous car sped along a closed track and came to a stop for a person wearing a Band 2 who couldn't be seen by the car's cameras, its other sensors or any of us inside. Instead, the band pinged an Azure-connected stop light on the track to say it was nearby, and that information was then relayed to the car.
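The data path in that demo is essentially a device-to-cloud-to-vehicle relay. Here's a minimal, self-contained Python sketch of that flow; the class and method names are hypothetical stand-ins for illustration, not Microsoft's or IAV's actual APIs.

```python
from dataclasses import dataclass

# Hypothetical sketch of the demo's relay: wearable -> connected stop light -> cloud -> car.
# None of these names come from Microsoft or IAV; they only illustrate the data path.

@dataclass
class ProximityPing:
    device_id: str       # e.g. the Band 2 worn by the hidden pedestrian
    intersection_id: str # the connected stop light that picked up the ping
    distance_m: float    # rough distance reported to the stop light

class CloudRelay:
    """Stands in for the Azure-hosted service that forwards pings to nearby vehicles."""
    def __init__(self):
        self.subscribers = []  # vehicles listening for hazards at this intersection

    def subscribe(self, vehicle):
        self.subscribers.append(vehicle)

    def publish(self, ping: ProximityPing):
        for vehicle in self.subscribers:
            vehicle.on_hazard(ping)

class Vehicle:
    def __init__(self, name: str):
        self.name = name
        self.braking = False

    def on_hazard(self, ping: ProximityPing):
        # The car never "saw" the pedestrian; it reacts to the relayed ping alone.
        if ping.distance_m < 30.0:
            self.braking = True
            print(f"{self.name}: pedestrian {ping.device_id} near {ping.intersection_id}, braking")

# The stop light receives the band's ping and hands it to the cloud relay.
relay = CloudRelay()
car = Vehicle("CHAD demo car")
relay.subscribe(car)
relay.publish(ProximityPing(device_id="band-2", intersection_id="track-light-1", distance_m=12.0))
```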


The system is Microsoft-heavy, using the Azure IoT Suite and Cortana Analytics for predictive hazard modeling. It also requires a connected car and connected infrastructure like stop lights and signs. But as Karsten Schulze, senior vice president of active safety and driver assistance systems for IAV, pointed out, it's meant to work alongside all the other sensors coming to cars with autonomous features. So automakers could use it in addition to safety features from other vendors.
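In other words, the cloud-relayed alert is just one more detection source feeding the car's decision-making. A rough Python sketch of that idea, with illustrative names that don't come from either company:

```python
# Hypothetical sketch of how a cloud-relayed alert could sit alongside a car's own
# sensors: each source independently reports hazards, and the car reacts to any of them.

def hazard_detected(camera_hits: list, radar_hits: list, cloud_alerts: list) -> bool:
    """Treat the connected-infrastructure alert as just another detection source."""
    return bool(camera_hits or radar_hits or cloud_alerts)

# A pedestrian hidden from cameras and radar still triggers a stop via the cloud alert.
print(hazard_detected(camera_hits=[], radar_hits=[],
                      cloud_alerts=["pedestrian near track-light-1"]))  # True
```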


Keeping pedestrians safe wasn't the only feature built into the demo Volkswagen. The driver was also able to ask the Cortana voice assistant about the weather and join a Skype conference while the car drove itself. It even displayed a PowerPoint presentation in the instrument panel. You wouldn't want to use those features while driving yourself, but the companies are looking into adding more productivity tools for when vehicles become fully autonomous.

So when the driverless car hits the road, get ready to start working during your commute.