One major issue for autonomous vehicles is driving in fog. Many types of self-driving technology use visible light to determine how to navigate. This becomes a real problem when driving conditions are poor, and especially when there's fog. But now, researchers at MIT have developed a method of producing images of objects within extremely thick fog and judging their distance.
The system uses a time-of-flight camera, which measures distance by timing how long a pulse of light takes to bounce off an object and return. On a clear day, the process is pretty simple. But on a foggy day, the light scatters and is often reflected back by water droplets suspended in the air, rather than by the objects a vehicle needs to avoid.
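The distance calculation itself is straightforward: light travels to the object and back, so the range is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name and timing value are illustrative, not from the MIT system):

```python
# Time-of-flight ranging: distance from a light pulse's round-trip time.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to an object given the pulse's round-trip travel time.

    The pulse travels out and back, so the one-way distance is half
    the total path covered.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A return after roughly 66.7 nanoseconds puts the object about 10 m away.
print(tof_distance(66.7e-9))
```

In clear air this is all a time-of-flight camera needs; the trouble, as the article notes, starts when fog photons come back early and swamp the real return.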
This makes a time-of-flight system far less reliable for autonomous vehicles in foggy conditions. But the team at MIT realized that statistics could compensate for the lost accuracy. Regardless of how thick the fog was, the arrival times of the fog-scattered light followed a pattern known as a gamma distribution. The system they developed uses this fact to subtract the fog reflections from the signal entirely. What's left are the actual objects that need to be avoided.
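The idea above can be sketched in a few lines: fit a gamma distribution to the photon arrival times, compute the counts that fog alone would explain, and subtract them so the object's return stands out. Everything here (the simulated arrival times, the object at ~40 ns, the distribution parameters) is an illustrative assumption, not the MIT team's actual pipeline:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated photon arrival times in nanoseconds: a large cloud of
# fog backscatter (gamma-distributed, per the MIT observation) plus
# a narrow spike of photons reflected by a real object near 40 ns.
# All parameters are made up for illustration.
fog = rng.gamma(shape=2.0, scale=8.0, size=9_000)
obj = rng.normal(loc=40.0, scale=0.5, size=1_000)
arrivals = np.concatenate([fog, obj])

# Fit a gamma distribution to the full signal. Fog photons vastly
# outnumber object returns, so the fit models the fog background.
shape, loc, scale = stats.gamma.fit(arrivals, floc=0.0)

# Histogram the arrivals and subtract the counts the fitted fog
# model predicts in each bin.
counts, edges = np.histogram(arrivals, bins=200, range=(0.0, 80.0))
centers = (edges[:-1] + edges[1:]) / 2.0
bin_width = edges[1] - edges[0]
expected_fog = stats.gamma.pdf(centers, shape, loc, scale) * arrivals.size * bin_width
residual = counts - expected_fog

# The largest leftover peak marks the object's true arrival time,
# which maps back to distance as in ordinary time-of-flight ranging.
peak_ns = centers[np.argmax(residual)]
print(peak_ns)  # lands near the simulated object at ~40 ns
```

The key design point is that the gamma fit needs no prior knowledge of how dense the fog is; the distribution's parameters are estimated from the incoming data itself, which is what lets the approach hold up across very different fog thicknesses.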
You can read about the technology in more depth at MIT News, but the bottom line is that this could help self-driving vehicles better handle poor weather. Right now, self-driving pilot programs usually run in areas with generally good, clear weather. But eventually, these companies will have to start testing their tech in foggy and rainy regions, and this research may help them do that.