Question: Are you concerned about the role of robot cars in the future?
Answer: First of all, I have to confess that this question didn’t come from a reader. It’s a question I asked myself, prompted by an opportunity.
Earlier this week I got a ride in the new Tesla Model 3. In addition to being an all-electric status symbol for the tech generation, it can drive by itself. At least some of the time. It’s what you’d call a level two autonomous car.
As a quick primer, there are five levels of driving automation (six if you count level zero, which, as you might guess, offers no automation at all). A level two car can take full control of the vehicle, including accelerating, braking and steering, but the driver must stay attentive and be ready to intervene when the system needs help.
As someone who prefers to be a driver rather than a passenger because I don’t particularly like to cede my driving safety to another human, I was concerned that I’d be nervous or uncomfortable riding in a semi-autonomous car.
In fact, it was the opposite. The car braked when it felt right, made smooth lane changes and never tailgated. Not once did I feel the urge to press the invisible passenger brake pedal.
Maybe my comfort in the autonomous car came from a meeting I attended that same day, where we discussed the fact that 94 percent of fatal crashes involve human error.
Right now autonomous vehicles have a long way to go before we see one show up at our front door without a driver, or even a steering wheel, but already they can do a lot of things better than a human driver.
While I can only look in one direction at a time, this car has eight cameras covering 360 degrees with up to 250 meters of range. Even on my best day I can’t pay attention at the same level as an autonomous car.
I know not everyone is on board with autonomous cars. It’s still a young technology and there are plenty of questions about safety, security and reliability that will take time to answer.
As a traffic safety nerd, the question I most wonder about is, “Am I safer driving myself or giving control to a computer?” I think the answer depends partly on what kind of driver you already are.
Let’s consider the top three factors in fatal crashes: impaired driving, speeding and distracted driving. Your likelihood of getting in a crash is linked to your willingness to engage in those activities.
In comparison, an autonomous car will never drive drunk or high, shouldn’t be able to speed and won’t text a friend about dinner plans while driving through a red light.
If you're committed to avoiding those behaviors yourself, you and the robot car will both be a lot safer than the dangerous drivers.
Where an autonomous car should really out-perform a human is in responding to unexpected hazards.
Humans are bad at identifying hazards we don't expect to see. That's why you hear drivers admit that they didn't see a motorcyclist, bicyclist or pedestrian. When we're driving, we're watching for other cars, and we get surprised by the things or people we're not looking for.
With eight cameras and a 360-degree view, an autonomous car should easily put us to shame. And usually they do.
Unfortunately, there have been some high-profile autonomous car crashes that show that they are not yet infallible either. In those cases, both the car and the human failed to respond to a hazard, with disastrous outcomes.
My takeaway from riding in an autonomous car is that while technology can certainly help make us safer drivers, for the foreseeable future it's up to all of us as drivers to commit to safe driving practices. We can't blame the robots yet; we have to take responsibility for safety on our roads.
Road Rules is a regular column on road laws, safe driving habits and general police practices. Doug Dahl is the Target Zero Manager for the Whatcom County Traffic Safety Task Force. Target Zero is Washington’s vision to reduce traffic fatalities and serious injuries to zero by 2030. For more traffic safety information visit TheWiseDrive.com.