3 Big Hurdles to Robo Trucks Ruling the Roads

By Sean M. Lyden - Staff Writer
Posted Jun 4th 2018 10:17AM

So much has happened in autonomous vehicle development since Google put its first self-driving Prius on the road in 2009. And the industry consensus is that highly automated vehicles will legally hit the market starting in 2021, only three years from now.

But there are still many questions—beyond the technology’s capabilities—that government, industry, and citizens must answer before robots will rule the roads.

Those questions can be boiled down to three “hurdles” that may cause the industry and society to tap the brakes on the large-scale development and commercialization of fully autonomous trucks.

Hurdle #1: Fear
Here’s the deal: although autonomous vehicles promise significantly greater safety than their human-driven counterparts, U.S. drivers don’t believe it yet, emotionally or practically.

That’s based on the findings of a AAA report from earlier this year, in which about two-thirds of U.S. drivers (63%) said they would be afraid to ride in a self-driving vehicle.

And nearly half of those drivers (46%) said they would feel less safe sharing the road with fully autonomous vehicles while driving a regular vehicle.

Yet compared with last year’s figure (over 75%, versus 63% this year), the level of fear appears to be declining.

The industry has taken notice of this fear issue and is working to counteract it.

For example, companies like Waymo, Uber (at least until the fatal crash in Arizona in March), GM, and Boston-based nuTonomy have launched programs in the past year that offer self-driving rides to select passengers in limited locations around the world. The idea is to get people accustomed to riding in these vehicles and to have them share their experiences with family, friends, and colleagues, with the hope of not only reducing fear but also increasing market demand for self-driving rides.

Hurdle #2: Ethics
Consider this scenario. A self-driving vehicle is approaching an unavoidable crash and must choose between killing 10 pedestrians and killing its own passenger.

What would you say is the right moral choice?

According to an MIT study, 76% of participants said it would be “more moral” for an autonomous vehicle to sacrifice one passenger than to kill 10 pedestrians.

This reflects the moral philosophy of utilitarianism, in which the morally good action is the one that benefits the greatest number of people; in this case, the vehicle sacrifices its one passenger to save 10 pedestrians.

But what if you’re the passenger of the self-driving car? Now, that’s a different story.

According to the study, you’re more likely to prefer a vehicle that will protect your life, not sacrifice it.

“It appears that people praise utilitarian, self-sacrificing [autonomous vehicles] and welcome them on the road, without actually wanting to buy one for themselves,” said the report.

This is a prime example of what the researchers call a “social dilemma”: people may have a strong consensus on what’s best for society as a whole, yet still prefer to act in their own self-interest. And this double standard could significantly impede the development of the regulations that would make autonomous vehicles commercially available.
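To see why the two preferences pull in opposite directions, here is a minimal, purely hypothetical sketch in Python. The Outcome model and both decision functions are illustrative assumptions built around the crash scenario described above, not anything taken from the MIT study or from any real vehicle’s planning software.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Outcome:
    """One possible resolution of the unavoidable crash (hypothetical model)."""
    label: str
    pedestrian_deaths: int
    passenger_deaths: int

def utilitarian_choice(outcomes: List[Outcome]) -> Outcome:
    # Utilitarian rule: minimize total deaths, all lives weighted equally.
    return min(outcomes, key=lambda o: o.pedestrian_deaths + o.passenger_deaths)

def self_protective_choice(outcomes: List[Outcome]) -> Outcome:
    # Self-protective rule: protect the vehicle's own passenger first;
    # break ties by minimizing pedestrian deaths.
    return min(outcomes, key=lambda o: (o.passenger_deaths, o.pedestrian_deaths))

# The study's scenario: swerve (kill the one passenger) or
# stay on course (kill 10 pedestrians).
crash_options = [
    Outcome("swerve into barrier", pedestrian_deaths=0, passenger_deaths=1),
    Outcome("stay on course", pedestrian_deaths=10, passenger_deaths=0),
]

print(utilitarian_choice(crash_options).label)      # -> swerve into barrier
print(self_protective_choice(crash_options).label)  # -> stay on course
```

Same inputs, two defensible rules, two opposite answers: the utilitarian rule accepts one death instead of ten, while the self-protective rule keeps its passenger alive. That gap, in code form, is the social dilemma regulators would have to settle.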

To encourage more public discussion of this issue on a global scale, one of the study’s authors launched Moral Machine (http://moralmachine.mit.edu/), an online platform that invites the public to help build a crowd-sourced picture of human opinion on how machines should make decisions when faced with moral dilemmas, and to discuss potential scenarios of moral consequence.

Hurdle #3: Politics
Last fall, the House of Representatives passed H.R. 3388, the Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution (SELF DRIVE) Act.

There are two key things to note with this bill, which was approved on a bipartisan vote:

● It establishes a national framework for the use of self-driving vehicles.
● It defines the roles of the federal and state governments for self-driving cars, including a requirement that the Department of Transportation (DOT) develop comprehensive regulations for AVs.

And without getting too much in the weeds on this, here are the big takeaways as to why this matters for the development of autonomous vehicles:

1. Congress rarely agrees on anything anymore. The bipartisan support signals cooperation, commitment, and momentum, and suggests that something will actually get done here.

2. The patchwork of state laws would likely slow down development. So having a national framework would help create regulatory consistency.

3. And that consistency, driven by uniform development guidelines, would allow OEMs to build autonomous vehicles for all 50 states, making the vehicles more economical to manufacture.

This all seems like progress toward a self-driving world. So, where’s the hurdle?

For one, the House’s SELF DRIVE Act excludes commercial trucks above 10,000 lbs. gross vehicle weight rating (GVWR), such as the straight trucks and tractors used in expedited trucking.

The Senate is considering adding commercial trucks to its version of the bill, with Navistar and the American Trucking Associations advocating for their inclusion; the trucking industry and OEMs want some form of regulatory guidelines in place before they make huge investments in these systems.

But the Teamsters are lobbying against commercial truck inclusion because of the impact of automation on trucking jobs.

Then there is the issue of recent crashes involving vehicles in self-driving or Autopilot mode, which have brought greater regulatory scrutiny.

And so the Senate bill, dubbed the AV START Act, is stalled for now. There’s still a chance it could be included as part of the FAA reauthorization bill, with the current authorization set to expire on September 30.

And that’s just the stuff that’s going on right now.

You also have long-term political challenges.

Take, for example, the idea of trying to achieve legislative agreement on ethical frameworks.

Think about how thorny the self-driving issue can be. It hits at the heart of religion, personal values, and moral decision-making.

After all, it's hard enough as humans to make split-second moral decisions in a crisis. But at least we have the power at that moment to choose with our conscience.

But would we, as a society, be OK with the idea of being spectators inside machines that make life-and-death decisions on our behalf, without our consent?

Engineers are making major breakthroughs in the artificial intelligence that will function as the “brain” of tomorrow’s fully autonomous vehicles. But will they figure out how to give a machine a soul?

That’s the question—and that’s a big dilemma for Congress, the industry, and us to sort out.