Fog off! No more misty eyes for self-driving cars, declare MIT boffins

Auto autos prevented from being blinded by the elements – using the power of statistics

MIT brainiacs have come up with some newfangled technology that could help self-driving cars cope with misty mornings.

Adverse weather conditions such as rain or fog have long bedeviled autonomous vehicles, with studies showing that today's robo-rides will, at best, pass control back to their human masters when visibility drops.

Led by graduate student Guy Satat, the MIT team sought to make the eyeballs of navigation systems as good as, or better than, those of people in conditions of dense fog.

Computer-controlled cars these days typically use LiDAR to feel out their surroundings, by firing wave after wave of laser light and measuring the time it takes reflections to return. These pings allow the onboard software to build up a 3D model of people, vehicles, and other objects around the car in real time.
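For those keeping score at home, the time-of-flight sums are simple enough. Here's a minimal Python sketch (not MIT's code, just the underlying arithmetic): the distance to an object is half the round-trip time multiplied by the speed of light.

```python
# Illustrative time-of-flight calculation for a single LiDAR return.
# Not the MIT team's code, just the basic principle: the laser pulse
# travels to the target and back, so distance is half the round trip.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_return(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into a distance in metres."""
    return 0.5 * SPEED_OF_LIGHT * round_trip_seconds

# A reflection arriving 200 nanoseconds after the pulse left the emitter
# corresponds to an object roughly 30 metres away.
print(distance_from_return(200e-9))  # ~29.98 metres
```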

Blinded me with science

On a clear day, this works well. However, on a foggy day, the laser light is scattered by water droplets in the air, giving a false impression of the surroundings. This is not ideal for an autonomous car seeking to avoid obstacles such as other vehicles, pedestrians, cyclists, trees, or the edge of the road.

The timings of the fog-scattered blips of reflected laser light follow a gamma distribution, though. That's quite handy, because a gamma distribution is defined by just two parameters, so software hooked up to the LiDAR can estimate those values on the fly and model the noise caused by the fog. By filtering out that modelled noise using this statistical analysis, the system ends up with a much clearer perception of the surrounding environment.
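As a back-of-the-envelope illustration of the idea (and emphatically not the team's actual code), here's a Python sketch using NumPy and SciPy: fit a gamma distribution to the photon arrival times to model the fog back-scatter, subtract that fitted background from the arrival-time histogram, and take whatever's left over as the real reflection. The fog parameters, bin width, and simulated photon counts below are made up for demonstration purposes.

```python
# Illustrative sketch of the statistical trick, not the MIT implementation:
# model fog back-scatter arrival times with a gamma distribution, estimate
# its two parameters on the fly, subtract the fitted background, and treat
# the surviving peak as the genuine reflection.
import numpy as np
from scipy.stats import gamma

def estimate_reflection_time(arrival_times_ns, bin_width_ns=1.0):
    """Return the estimated round-trip time (ns) of the real reflection."""
    # Histogram of raw photon arrival times, signal and fog mixed together.
    bins = np.arange(0.0, arrival_times_ns.max() + bin_width_ns, bin_width_ns)
    counts, edges = np.histogram(arrival_times_ns, bins=bins)
    centres = 0.5 * (edges[:-1] + edges[1:])

    # In dense fog most photons are back-scatter, so a gamma fit to all
    # arrivals is a decent stand-in for the fog background.
    shape, loc, scale = gamma.fit(arrival_times_ns, floc=0.0)
    background = gamma.pdf(centres, shape, loc=loc, scale=scale)
    background *= counts.sum() * bin_width_ns  # scale density to counts

    # Whatever the fog model fails to explain is taken as the object.
    residual = counts - background
    return centres[np.argmax(residual)]

# Made-up demo: fog photons spread broadly in time, plus a tight cluster
# of signal photons from an object roughly 100 ns (about 15 m) away.
rng = np.random.default_rng(0)
fog = rng.gamma(shape=2.0, scale=30.0, size=5000)
signal = rng.normal(loc=100.0, scale=1.0, size=300)
print(estimate_reflection_time(np.concatenate([fog, signal])))  # ~100 ns
```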

In experiments, Satat set up a metre-long fog chamber, through which the system was able to make out the shapes of wooden blocks, a human figurine, and even letters that the human eye could not discern.

“What’s nice about this is that it’s pretty simple,” Satat said. “If you look at the computation and the method, it’s surprisingly not complex. We also don’t need any prior knowledge about the fog and its density, which helps it to work in a wide range of fog conditions.”

Tesla, at least, may not be interested. In a February earnings call with Wall Street analysts, Elon Musk, the boss of the electric automaker, said of LiDAR: “in my view, it is a crutch,” before suggesting that combining image recognition with radar would be a better option. The Culver City Fire Department, over in California, USA, whose fire engine was rear-ended by a Tesla running on Autopilot back in January, will be hoping Musk makes good on that promise. ®

In related news

Arizona cops released footage on Wednesday evening taken from inside the Uber self-driving car that hit a woman as she walked her bike across a street at night over the weekend. She died from her injuries soon after. The video is fairly disturbing as it runs to the split second before the crash. Investigations continue.
