To hear a growing, ever-more-vocal crowd tell it, driverless cars are in America’s very near future. Commentators have taken the media to task for giving short shrift to the new technology, with some going as far as to declare 2017 “the year of the driverless car” (Or was that 2016? 2015?). Ars Technica senior reporter Timothy Lee is confident that “driverless car adoption will only accelerate in 2018,” pointing to tests in Phoenix made possible by “permissive” regulations.
Critics of government intervention paint driverless car rules as just another front in the battle against heavy-handed regulations. At first glance, this certainly appears to be the case. Contrast Arizona Gov. Doug Ducey’s permissive approach to autonomous vehicle testing with California’s long list of specifications about vehicle readiness and company-customer communication. But regulations pertaining to “roadworthiness,” or the suitability of a vehicle for the road, require a different conceptualization than most other government rules.
To see why, consider the following thought experiment: instead of allowing driverless cars on Arizona roads, Ducey allows any citizen to drive without passing a road test. This shouldn’t and wouldn’t be viewed as a “deregulatory” action, since roads are almost entirely public and officials establish countless rules to ensure they are used safely and effectively. If the road system were sold to private entities and state rules didn’t apply, companies would likely enforce a variety of competency tests of their own choosing.
Yet no administrator in their right mind, public or private, would allow people to take to the roads regardless of experience and knowledge of the rules. Allowing pre-teens (or adults with bad driving records) to get behind the wheel would likely increase accidents and unfairly impede motorists, producing large productivity losses and bloodthirsty commuters. Decreased safety would also mean more taxpayer dollars devoted to accident response and road maintenance.
Are Driverless Cars Really Safer than Human Drivers?
But driverless cars are nothing like 12-year-olds behind the wheel, proponents insist. Robots are far more trustworthy drivers than their human counterparts, making it in the public interest to put driverless cars in the fast lane toward public use. This claim is taken as gospel, yet precious little evidence is put forward to support it.
Existing safety studies of partially autonomous vehicles, like the Tesla Autopilot, come up short in demonstrating concrete safety benefits. Tesla received a public relations boost earlier this year when the National Highway Traffic Safety Administration found that the Autopilot feature lowered crash rates by nearly 40 percent. A finding that large should have established the credibility of quasi-autonomous vehicle features, but flaws in the study cast doubt on the government’s claims.
The report failed to account for the introduction of automatic braking several months before the introduction of Autopilot. A 2016 study from the Insurance Institute for Highway Safety examining the impact of automatic braking found a 39 percent reduction in rear-end crash rates from that feature alone. If the IIHS findings are accurate, nearly the entire decline in Tesla crash rates can be attributed to automatic braking rather than to the automatic steering touted as a prelude to full autonomy.
Meanwhile, any lack of a safety boost in the data is simply explained away as the vehicle not being autonomous enough.
In fact, the fatal Tesla Autopilot crash in 2016 involved a driver who allegedly ignored the safety warnings his vehicle issued. In a similar vein, Waymo, Google’s self-driving car initiative, used footage of “drivers” asleep in semi-autonomous vehicles to argue that anything short of full autonomy was prone to human inattentiveness. For about two months now, the company has ditched the human driver altogether in favor of an (allegedly) fully autonomous experience.
But this transition from assisted to unassisted driving is the very opposite of the human driving test we all know and love. During its “mostly autonomous” phase, Waymo’s vehicles proved unable to make left turns and proceeded more slowly than the flow of traffic. Yet other motorists are expected to take Waymo’s claims of continuous improvement at face value, without any sort of oversight.
Driverless Cars Should Be Held to Equal Safety Standards
Even the “heavy-handed” California regulations don’t require carmakers to supply data attesting to vehicle safety before putting their driverless cars on public roads. And company assurances of safety ring hollow given the fishy actions of top executives. Google quietly stopped publishing monthly accident data in December 2016, a strange move considering the company’s insistence that safety continues to improve. Waymo executives also reportedly refuse to run public demonstrations on the freeway, sticking instead to exceptionally safe, clearly marked roads in a quiet part of Phoenix.
If a driving test-taker proved unable to make certain turns, drive at normal speed, or traverse busier roads, he would be asked to undergo additional instruction and retest. He would be laughed at for insisting that a license be granted regardless of such failures. Creating a lighter standard for “autonomous vehicles” amounts to governmental favoritism toward an unproven industry.
If companies like Waymo and Uber want to prove that their vehicles can handle a wide variety of conditions and situations, they can do so without using motorists and pedestrians as guinea pigs. These multibillion-dollar behemoths can invest in elaborate test tracks that recreate hazardous real-world conditions. Buying large tracts of land in hilly areas, building test roads, and hiring “extras” to serve as pedestrians could go a long way toward transparently proving efficacy.
But instead, a combination of cronyism, secrecy, and taxpayer-funded testing and accident response is creating a large double standard. Over the coming decades, autonomous vehicles may well bring large benefits to public safety and transform the way we live. But along the way, unforeseen costs and pitfalls must be addressed to ensure that taxpayers aren’t bilked for a technology that doesn’t live up to the hype.