
Why Self-Driving Cars and ADAS Are Not Safe (Yet)


Why are darkness and bad weather considered “corner cases” by the autonomous vehicle industry?


A close call in the rain this weekend prompted me to share a personal story about why cars need to see better, and it reminded me why self-driving cars and ADAS are not safe yet.

It was lightly raining and getting dark while I was driving to pick up a few last-minute items for a party. As I neared an unprotected intersection, a bike rider turned wide onto the road. I saw him a bit late, but swerved and slowed down in time, so there was no real danger. It happened near the spot where my son was hit by a car a few months ago while biking home from school in the rain.

He luckily escaped with only a bad wrist sprain and various scrapes and bruises, but it could have been much worse. The driver had stopped her car, a fairly new Japanese luxury sedan, by the side of the road to pick up her grandchild from school and pulled out without seeing my son. There were no obstructions behind the car, yet neither the driver nor the Autonomous Emergency Braking (AEB) system in that car model (assuming that option was purchased) saw him in time to stop.

Road accidents will be significantly reduced with more robust and accurate computer vision

Needless to say, this happens a LOT. In the US alone, roughly 45,000 cyclists are involved in road accidents per year. Improving the computer vision accuracy of Advanced Driver Assistance Systems (ADAS) like the AEB system above is a fundamental and urgent requirement, one that automotive OEMs are working on. But these systems still struggle with everyday use cases: driving at night, in rain or fog, and dealing with snow and dirty sensors. This will become a bigger problem as more self-driving autonomy is introduced into vehicles.

“As an optical system, Lidar and camera are susceptible to adverse weather and its performance usually degrades significantly with increasing levels of adversity.” This quote is from Andreas Haja’s comprehensive article on Autonomous Driving, which outlines various ADAS and autonomous vehicle computer vision and perception components and architectures, with a focus on multi-sensor fusion.

“Cameras are good at object detection but struggle with low light. Lidar works in every lighting condition but suffers in scenarios like snow.” That is a takeaway from Aparna Narayanan’s recent article on how soon (or far off!) self-driving cars are.

I wanted to reference these posts as they highlight some key takeaways.
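To make that complementarity concrete, here is a minimal, purely illustrative Python sketch of condition-weighted late fusion: each sensor’s detection score is discounted by an assumed reliability factor for the current condition before the scores are combined. The reliability values, names, and fusion rule here are hypothetical assumptions for illustration, not code from any of the systems discussed.

```python
# Toy sketch of reliability-weighted late fusion of camera and lidar
# detections. Everything here (names, numbers) is an illustrative
# assumption, not code from any real ADAS stack.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "cyclist"
    confidence: float  # raw detector score in [0, 1]

# Assumed per-sensor reliability in each condition (made-up values
# that mirror the quotes: cameras weak at night, lidar weak in snow).
RELIABILITY = {
    "camera": {"clear_day": 0.95, "night": 0.40, "snow": 0.80},
    "lidar":  {"clear_day": 0.95, "night": 0.90, "snow": 0.35},
}

def fuse(camera: Detection, lidar: Detection, condition: str) -> float:
    """Combine two detectors' scores for the same object with a
    reliability-weighted average -- a minimal late-fusion rule."""
    w_cam = RELIABILITY["camera"][condition]
    w_lid = RELIABILITY["lidar"][condition]
    return (w_cam * camera.confidence + w_lid * lidar.confidence) / (w_cam + w_lid)

# At night, the lidar's strong return outweighs the camera's weak one.
cam = Detection("cyclist", confidence=0.30)
lid = Detection("cyclist", confidence=0.85)
print(f"fused night score: {fuse(cam, lid, 'night'):.2f}")  # ~0.68
```

The point of the sketch is only that neither sensor alone is trustworthy across all conditions; real perception stacks fuse far richer signals, which is exactly why degraded sensing in bad weather cannot be an afterthought.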

Most traditional automotive OEMs and next-gen mobility providers, with the exception of Tesla, see achieving SAE Level 4 or Level 5 autonomy as a series of pragmatic steps: building on advancements in ADAS and progressively expanding autonomous capabilities and use cases as the technology evolves.

This is being done by breaking the use cases into manageable “bites” for introducing new ADAS and autonomous functions. Today’s ADAS capabilities are active only in clear highway driving conditions, in self-parking applications, and in back-up scenarios.

Autonomous vehicle and autopilot testing is limited to highly mapped, geofenced areas during clear daytime conditions. The consensus is that these systems have a long way to go before being as capable as a human driver. As such, leaders at GM, Waymo, Uber, Cruise, Aurora, and others are hedging on the timeframe for practical fully autonomous deployment.

Addressing this needs to start with more robust and accurate approaches to autonomous vision, and the industry can’t keep treating darkness and bad weather as a “corner case” problem.

The good news is that many are focusing on advancing the state of the art, and I’m excited to be part of a team that has proven we can see in the dark, see through rain and other harsh conditions, and massively improve these systems, compared with the best the industry has today, for safer driver-assisted and self-driving vehicles.

Here are some examples showing detection that far exceeds the state of the art in very difficult scenarios.

Self-driving cars and ADAS are not safe yet, but with robust perception technology for all conditions, they will get one step closer.
