Column: We expect perfection from self-driving cars, but not from human drivers

Joshua Brown died letting his Tesla do the driving for him in Williston. Just a few days earlier I was test-driving the same self-driving system in another Tesla for a column.

Both of us were playing a game of chicken with technology, maybe not fully appreciating the gamble we were taking.

A lot of people are unnerved by the idea of vehicles on autopilot, but we are destined for a future in which most of our cars drive themselves. Since we aren't there yet, I tried something closer to what I'd call half-driving. I turned my life over to the Autopilot setting on a $100,000 Tesla Model S, which let the electric car drive itself, sort of.

It doesn't respond to stop signs or red lights. It doesn't drive a route, turning down one street and then on to another on its own. What it does do, at least on a highway, still seems incredible.

Picture going 70 mph on the highway when the road curves, the cars ahead speeding up and slowing down, vehicles on either side of you and — this is the crucial part — you are admonished to avoid braking, accelerating or steering the Tesla, because the car does all of that for you.

"You don't want to override anything," the Tesla PR person told me.

What worried me was that she also cautioned me to keep my hands on the wheel, if only lightly, and to remain vigilant. I inferred this was because, well, it would be crazy to trust a car to drive itself.

How do I know when I should and shouldn't take control? I asked. "I don't have that answer," she replied. Very comforting.

The first 20 minutes of driving in Autopilot is an exercise in trust and willpower.

My body tensed. I definitely winced. I may have grunted. Or yelped. My foot had a mind of its own and hovered just above the brake pedal. Twice my instincts kicked in and I took control.

Other drivers had no idea there wasn't a person driving the fancy car beside them. The Tesla stays in its lane by using cameras, radar and sensors to watch lane lines or, if they are faded, the vehicle in front of it. The car slows itself or stops as needed, then speeds up again. It steers through curves.

Flip on a turn signal, and the Tesla will look for the right time to change lanes, then cross over when it sees fit.

Which is where the car and I had a disagreement.

I'm not good at delegating to a machine. The Tesla drives like it wants to drive, which, odds are, isn't exactly the way you would. I became a bit of a front-seat driver. Once or twice the Tesla was a wimp, slowing down when it saw something minor ahead that I knew was nothing meaningful.

The real test was when I purposely hit the turn signal when there wasn't much of a gap between cars in the next lane over.

The car waited a beat, then moved like it was going to get over. To me it felt like it was going to cut off another driver. My nerves turned from steel to cooked spaghetti. I tapped the brake and stopped it from switching lanes.

"That's what you don't want to do," Tesla's backseat PR person told me. The Tesla would have sped up and gotten over safely, she said.

Driving is a complex, messy exercise. It takes mountains of trust — or self-delusion — to believe a machine will get it right and not kill us. The same could have been said back in the early days of planes or regular cars.

We have fresh soul-searching and cold analysis to do now. The National Highway Traffic Safety Administration, which recently disclosed Brown's accident, is investigating Tesla's Autopilot system. Tesla said the death is the first in more than 130 million miles of driving with Autopilot activated.

In Brown's case, his Tesla didn't automatically brake for an 18-wheeler because it didn't pick up the trailer's white side against a brightly lit sky, according to Tesla and the government.

Of course, there are lots of white trucks on highways and lots of days when the sun lights the sky.

We have less tolerance for machines that kill us than we do when our fellow, foolish human drivers do the same. To err is human. To be perfect is what we demand of machines designed to act like humans.

I have to believe that technology can solve a white-truck/sunlight problem. I hope it also can handle the host of other variables and nuances involved in driving. Still, when it comes to testing the bounds of technology's surging role in our lives, we are perennially the guinea pigs.

Matt Kempner writes for the Atlanta Journal-Constitution. © 2016 Cox Newspapers

Published July 5, 2016.

