Hard reality: ‘Open road’ still closed to robotic cars

By Junko Yoshida

The glowing future of autonomous driving is really hard to pull off.

If you think we have spent an awful lot of ink on coverage of autonomous cars, you’re right. Colour me guilty.

Just as the general media can’t seem to get enough of Donald Trump, technology and business reporters can’t help being (rather childishly) awed by robotic cars. We often turn ourselves into passive mouthpieces for the technology companies who breathlessly tout the hazy-bright future of driverless cars.

In the last few months, however, we’ve begun to perceive a different reality: That glowing future—autonomous driving—is really hard to pull off.

Autonomous cars’ limitations range from the weather to road construction (a problem that pops up often and unexpectedly, in a million variations) and reading other drivers’ intentions at intersections.

But the biggest problem of all—from the robotic point of view—might be us, humans.

Driver optional?

Tesla’s recent fatal crash brought one critical question back into focus: Should the future of autonomous driving be left to a vehicle completely without a driver, or a vehicle with the driver "optional"? The automotive industry and human factors engineers are not close to agreeing on that.

Tesla warns drivers to keep their hands on the wheel while on Autopilot. But a driver who gets used to advanced driving assistance will inevitably relax, no longer paying attention to the instrument cluster or even to what’s going on directly in front of the car. We’re talking human nature here.

[car 01]
__Figure 1:__ *Tesla Model S instrument cluster (Source: Tesla)*

The highly autonomous car’s problem isn’t limited to the human driver inside. Once the ADAS car is elevated to Level 5 autonomy, it must learn to interact with all those other humans—often reckless and unpredictable—who share the road. Outlaw bikers. Little old ladies with glaucoma. NASCAR wannabes. Student drivers.

Technology optimists tell us: "Once we take humans out of the equation, driverless cars will make roads so much safer."

In theory, that’s true. The autonomous car can be designed with no blind spots, maintaining full 360-degree awareness. It will never drive drunk and can’t get sleepy. Its reaction time will put the quickest human to shame.

But real-world autonomous driving is reminding us of an important factor we prefer to overlook: The driverless car can’t live in isolation.

Art of defensive driving

Earlier this year, at Mobileye’s press conference, I heard Amnon Shashua, Mobileye co-founder and CTO, point out why we have driving schools. The student driver takes lessons not because he or she needs to learn where the engine is or how to fix it. The lessons teach traffic rules and, most importantly, other people’s driving behaviour.

If we follow Shashua’s logic, a self-driving car must learn not just how to instantly classify every object up ahead, but how to drive smoothly among other—spectacularly unpredictable—drivers on the road.

In short, driverless cars need to learn the art of defensive driving.

You might also recall a Google car accident earlier this year, in which the autonomous car hit a bus on El Camino Real. This is a classic example of an autonomous car making wrong assumptions about bus-driver etiquette.

[car 02]" src="https://images.contentful.com/rphcxwngga0b/4178EmIOxOS28cMiai0m0s/fac4350c038cf1b2513fe098a301fdf3/EETI_car2.jpg" />
Figure 2: Google car after the collision with a bus

The Google car—and its test driver—saw the bus approaching in the left rearview mirror, and assumed it would stop or slow to allow the Google [car] to continue. But bus drivers, as most of us learn the hard way, ask no quarter and give no quarter. They’re driving the biggest thing on the road, and it’s on somebody else’s insurance. The bus, of course, neither slowed nor gave ground.

Crunch.

Fortunately, no injuries were reported.

As Google explained later, this can be described as “a classic example of the negotiation that’s a normal part of driving.” Both parties, trying to predict each other’s move, guessed wrong. But how should a robotic car negotiate the road with humans?
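To see how such a negotiation can go wrong, consider a minimal sketch (mine, not Google’s actual planner) of a merge decision that leans on a prior belief about whether the other vehicle will yield. The function, the two-second gap rule and the 95% "buses yield" prior are all invented for illustration.

```python
# Toy illustration only; not Google's algorithm. The gap rule, the
# confidence threshold and the yield prior are invented for this sketch.

def should_merge(gap_m: float, other_speed_mps: float,
                 p_other_yields: float, threshold: float = 0.9) -> bool:
    """Merge if the gap is physically safe, or if we are confident
    the approaching vehicle will slow down to let us in."""
    physically_safe = gap_m > other_speed_mps * 2.0  # crude two-second rule
    return physically_safe or p_other_yields >= threshold

# Hypothetical prior: "buses yield 95% of the time", so the car merges...
print(should_merge(gap_m=8.0, other_speed_mps=7.0, p_other_yields=0.95))  # True
# ...but this particular bus driver does not yield. Crunch.
```

The failure mode isn’t in the arithmetic; it’s in the prior about human behaviour.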

Socially acceptable robotic cars

Meet Maarten Sierhuis, a former NASA research scientist who now directs Nissan’s Research Centre in Silicon Valley. He talked about “socially acceptable AI-based city driving” during his “Design for Auto” presentation at Semicon West in San Francisco last week.

Sierhuis listed a number of challenging situations driverless cars need to handle in city driving. Road construction, for example, presents a huge headache. “You’d be surprised to know how often road construction happens—essentially all the time—and that construction workers rearrange the placement of cones often,” said Sierhuis.

Other biggies for the autonomous car include intersections, right turn on red, four-way stops, and what’s described as a “monster intersection” in Sunnyvale, Calif.

[car 03]" src="https://images.contentful.com/rphcxwngga0b/4tON2xYfHieAMG0aiWqUEU/b823c5159e0d2806475d46a34c111d0c/EETI_car3.jpg" />
Figure 3: Mathilda 'Monster intersection' in Sunnyvale, Calif.

Sierhuis’ team has been working on technologies to “broadcast the autonomous car’s intention” to others—pedestrians, cyclists and other drivers—at an intersection. “We can’t tell others how to behave, but we can let them know what the autonomous car intends to do next,” said Sierhuis.

‘After you’

Last fall at the Tokyo Motor Show, Nissan demonstrated an “interaction indicator” that changes colour from blue to green, yellow, orange, and red, depending on the “alert level.” Nissan’s autonomous car also showed a large text readout at the base of the windshield, facing outward, that tells people the autonomous car’s intent, such as “after you.”
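As a rough sketch of how such an indicator might be driven, the Python below maps an internal alert level to the demo’s colour scale and pairs an intent with an outward-facing message. The enum names, message strings and mapping are my assumptions, not Nissan’s implementation.

```python
# Illustrative sketch of the "interaction indicator" idea described above.
from enum import IntEnum

class AlertLevel(IntEnum):
    CALM = 0      # blue
    AWARE = 1     # green
    CAUTION = 2   # yellow
    WARNING = 3   # orange
    CRITICAL = 4  # red

INDICATOR_COLOURS = ["blue", "green", "yellow", "orange", "red"]

def indicator_colour(level: AlertLevel) -> str:
    """Map the car's internal alert level to the external indicator colour."""
    return INDICATOR_COLOURS[level]

def windshield_message(intent: str) -> str:
    """Outward-facing text readout, e.g. when yielding to a pedestrian."""
    messages = {"yield": "AFTER YOU", "proceed": "CROSSING", "wait": "WAITING"}
    return messages.get(intent, "")

print(indicator_colour(AlertLevel.CAUTION))  # yellow
print(windshield_message("yield"))           # AFTER YOU
```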

Nissan’s autonomous “after you” readout might be well received in Japan, where being polite is a cherished virtue. In Boston, not so much.

“Normal” driving behaviour varies vastly from city to city, country to country or culture to culture.

In some cities, for example, it’s rude not to let a pedestrian cross if you’re already coming to a stop. In others, giving way to pedestrians can be seen as interrupting precious traffic flow.

Nissan’s Sierhuis agreed. He told us, “Even between Mountain View and Stanford Univ., we find people’s behaviour on the road different.”

Referring to the video clip (“rush hour traffic in a busy intersection”) shown by Audi’s Berthold Hellenthal on the same panel, Nissan’s Sierhuis said, “People in Teheran drive exactly like that.”

Understanding human behaviour is hard enough for the robotic car. Throw in ethnographic factors, and we’re talking about programming a self-driving car that must behave differently depending on where it is driving.
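One plausible way to handle that, sketched below, is to factor locale-specific norms into a configuration the driving policy consults. The regions, parameter names and values here are invented for illustration.

```python
# Hedged sketch: parameterising locale-specific driving norms.
# All regions and values are made up for the sake of the example.
from dataclasses import dataclass

@dataclass(frozen=True)
class DrivingNorms:
    yield_to_pedestrians: bool  # stop for waiting pedestrians?
    min_merge_gap_s: float      # gap other drivers typically leave
    horn_is_rude: bool          # is honking a warning or an insult?

NORMS = {
    "mountain_view": DrivingNorms(yield_to_pedestrians=True,
                                  min_merge_gap_s=2.0, horn_is_rude=True),
    "boston":        DrivingNorms(yield_to_pedestrians=False,
                                  min_merge_gap_s=1.0, horn_is_rude=False),
}

def planner_config(region: str) -> DrivingNorms:
    """Select behaviour parameters for the region the car is driving in."""
    return NORMS.get(region, NORMS["mountain_view"])  # conservative default

print(planner_config("boston").yield_to_pedestrians)  # False
```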

Skill decay

The human interface with machines is complicated. We’ve seen many examples of trained pilots falling prey to boredom and distraction as a result of flying on autopilot. Worried about potential “skill decay” among pilots who rely on autopilot, the FAA released a safety notice recommending that pilots fly in manual mode more often.

Earlier this year, I had a chance to chat with Richard Hartman, a retired commercial and military pilot. Our conversation centred on differences between flying and driving on autopilot.

Hartman pointed out that pilots usually spend a lot of time on flight plans before taking off. Once the plan is loaded into a computer, another pilot double-checks the information exhaustively: one pilot reads it aloud and the other confirms it. Then the air traffic control system sets up the course. If rough weather appears en route, controllers might alter the course away from the more direct path charted by GPS.

In contrast, a human driving a car, Hartman said, is never so orderly. Even with an ultimate destination—like home—the driver might change his or her mind to stop for coffee, or remember the need to pick up groceries on the way.

Hartman asked, “Do you think people will spend so much time in pre-planning and doing data entry to chart their course?”

A far more likely scenario in the real world, he predicted, is that people will, more often than not, “disengage the autopilot mode.”

Pranksters

And then there will be pranksters. Cyberspace is crawling with saboteurs eager for the opportunity to fool a robotic car.

Even if confusing a robot on the road is mere mischief (rather than terrorism), the result can be a fatal accident.

Asked during the Q&A session about interruptions potentially triggered by people’s misbehaviour, Sierhuis reluctantly acknowledged the possibility. If such misbehaviour persists, “[level 5] fully autonomous cars might never happen,” he said.

Rather than getting ahead of ourselves, it’s time to get more realistic about the timeline for fully autonomous cars.

Level 5 fully autonomous cars may emerge in 2020 or 2021, as many car OEMs and tech suppliers have promised. But they will come with big restrictions. They will be able to drive only “in a geo-fenced area”, or, more accurately, on a pre-defined route.
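A pre-defined route constraint can be as simple as a corridor check around a mapped polyline, as in this minimal sketch. The flat x/y coordinates and the arbitrary 5-metre corridor are simplifying assumptions; production systems work against geodetic HD maps.

```python
# Minimal sketch of a "pre-defined route" check: the car may operate only
# within a corridor around a fixed polyline. Coordinates are flat metres.
import math

def _dist_to_segment(p, a, b):
    """Shortest distance from point p to the segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def on_approved_route(position, route, corridor_m=5.0):
    """True if the car is within corridor_m of any segment of the route."""
    return any(_dist_to_segment(position, route[i], route[i + 1]) <= corridor_m
               for i in range(len(route) - 1))

route = [(0, 0), (100, 0), (100, 50)]     # pre-mapped polyline
print(on_approved_route((50, 3), route))  # True: inside the corridor
print(on_approved_route((50, 40), route)) # False: off the approved route
```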

In short, the hallowed “open road” will still be the province of outlaw bikers, little old ladies with glaucoma, NASCAR wannabes and student drivers.
