How Will Self-Driving Cars Handle Inclement Weather?

Although self-driving cars are all the rage these days, one of the technology's biggest limitations is that the systems cannot reliably drive during periods of inclement weather.

The main reason is that the cameras which pick up road dividers and obstacles can misinterpret falling snow and rain as obstructions on the road.

Before going into this challenge specifically, it’s worth noting an even higher-level challenge: how can cars think like humans?

As Gizmodo discusses in their article, there’s a world of difference between programming logic and the information humans process as they’re driving down crowded roads.

While it’s possible to program cars to drive based on logic straight from the DMV playbook, that covers only a small fraction of what drivers face on roads today. Braking force, turning speed, etc. – these are all highly situational and vary on a case-by-case basis.

According to Gizmodo, the three basic skills required for a car to drive itself are: understanding its location, figuring out the safest route to its destination, and knowing how to move to the next spot along that route.

If you’ve used autonomous parking technology, then you’ve already seen the third stage – movement – in action. Compared to the other two stages, movement is relatively trivial to program.
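To make the three stages concrete, here’s a minimal, purely illustrative sketch in Python. The function names and toy logic are made up for this example – they aren’t drawn from any real autonomous-driving stack:

```python
# Toy sketch of the three-stage loop: localize, plan, move.
# All names and logic here are hypothetical, for illustration only.

def localize(gps_reading):
    """Stage 1: estimate where the car is (here, we just trust a GPS-like reading)."""
    return gps_reading  # (x, y) position in metres

def plan_route(position, destination):
    """Stage 2: pick the safest route; this toy version is just a straight line."""
    return [position, destination]

def next_move(position, route):
    """Stage 3: compute the small motion toward the next waypoint."""
    target = route[1]
    return (target[0] - position[0], target[1] - position[1])

position = localize((0.0, 0.0))
route = plan_route(position, (10.0, 5.0))
print("move by:", next_move(position, route))
```

In a real vehicle each stage is vastly more complex, but the shape of the loop – figure out where you are, decide where to go, then execute the next small motion – stays the same.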

Going back to the issue of inclement weather, some headway is being made with the first two stages.

While it’s unlikely all the previously mentioned challenges will be overcome anytime soon, an article from Quartz mentions that Ford has been testing driverless cars on the snowy streets of Michigan since January. The company now claims that their vehicles can “see” raindrops and snowflakes thanks to an algorithm developed by the University of Michigan.

So how does the technology work?

Ford’s autonomous vehicles rely on LiDAR sensors which emit short bursts of laser light as the car moves down the road. Rather than relying on a single beam, the car pieces the returns from each burst together to assemble a 3D map of the environment.
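As a rough illustration of how individual returns become a 3D map, each echo’s measured distance and beam angles can be converted into a point in space. The sketch below is a simplified, hypothetical version of that geometry, not Ford’s actual pipeline:

```python
import math

# Illustrative only: turn a laser echo's range and beam angles into an (x, y, z) point.

def return_to_point(distance_m, azimuth_deg, elevation_deg):
    """Convert one laser return into Cartesian coordinates relative to the sensor."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

# Hypothetical returns from one sweep: (range in metres, azimuth, elevation).
sweep = [(12.4, 0.0, -2.0), (12.5, 1.0, -2.0), (3.1, 45.0, 5.0)]
point_cloud = [return_to_point(*r) for r in sweep]
print(point_cloud)
```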

By analyzing the bursts and the echoes that come back from objects, the cars are able to determine whether the lasers are hitting raindrops or snowflakes rather than solid obstacles.
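Ford hasn’t published the University of Michigan algorithm in detail, but one common idea behind this kind of filtering is that raindrops and snowflakes show up as isolated, scattered returns, while real obstacles produce dense clusters of points. The toy filter below illustrates that idea with a made-up neighbour-count rule – it is not the actual algorithm:

```python
import math

# Toy precipitation filter: keep points with enough nearby neighbours,
# drop isolated "specks" that behave like raindrops or snowflakes.

def filter_precipitation(points, radius=0.5, min_neighbors=2):
    kept = []
    for i, p in enumerate(points):
        neighbors = 0
        for j, q in enumerate(points):
            if i != j and math.dist(p, q) <= radius:
                neighbors += 1
        if neighbors >= min_neighbors:
            kept.append(p)
    return kept

# Hypothetical cloud: a tight cluster (a real object) plus two stray returns (precipitation).
cloud = [(5.0, 0.0, 0.5), (5.1, 0.1, 0.5), (5.0, 0.1, 0.6), (2.0, 3.0, 1.8), (8.0, -4.0, 2.2)]
print(filter_precipitation(cloud))  # only the clustered points survive
```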

Although it’s a significant advancement, Jim McBride, technical leader for autonomous vehicles at Ford, points out an important limitation of the system:

“Let’s say I could drive at 80 miles per hour in the fog because I had some magical sensor that could see through it,” he told Quartz. “That doesn’t mean the other cars on the road can see through it … You have to mix in with the rest of the world. If you engage a system like that, you become an invisible vehicle to the rest of the people.”

Does this mean driverless cars will never be practical for drivers in snowy regions?

Not necessarily.

In its discussion of this topic, Bloomberg mentions that the solution may lie in highly advanced artificial intelligence systems (some of which can perform as many as 24 trillion “deep learning operations” per second).

Deep learning means the cars learn how to behave from millions of miles of driving experience. Rather than trying to follow lane guides on snow-covered roads, a car can instead follow the ruts left by other vehicles.
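As a purely illustrative, deliberately oversimplified example of that fallback behaviour, imagine the perception system reporting how confident it is in the lane markings versus the ruts. The hand-written rule below is only a stand-in for what a learned model would pick up from data; the names and thresholds are invented for this sketch:

```python
# Toy fallback logic: prefer lane markings when clearly visible, otherwise follow ruts.
# Confidence values and thresholds are made up for the example.

def choose_steering_target(lane_center, lane_confidence, rut_center, rut_confidence):
    if lane_confidence >= 0.6:
        return lane_center, "lane markings"
    if rut_confidence >= 0.4:
        return rut_center, "tyre ruts"
    return None, "no reliable guide - slow down or hand back control"

# Snowy road: markings barely visible, ruts fairly clear.
target, source = choose_steering_target(lane_center=0.1, lane_confidence=0.2,
                                         rut_center=-0.3, rut_confidence=0.7)
print(source, target)
```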

With AI technology, vehicles will be able to make decisions in real time based on experience. Still, the technology isn’t going to enable cars to overcome extreme conditions such as a whiteout.
