Is Rhode Island Ready for Self-Driving Vehicles?

The state plans to test-drive autonomous public transit on public roadways next year.
Illustration by Brendan Totten

On November 8, the nation’s first self-driving shuttle bus debuted on Las Vegas’s public roads. The twelve-seat vehicle — with no brake pedals or steering wheel — was programmed to ferry passengers along a half-mile loop through the city’s Innovation District, under a year-long pilot project sponsored by AAA and transit operator Keolis, in partnership with the city.

In less than two hours, it crashed.

A truck backing into an alley from the street hit the shuttle’s front bumper. The electric Autonom vehicle, developed by the French company Navya, was equipped with eight LIDAR units (an acronym for Light Detection and Ranging, a remote sensing system that scans the area around the bus), GPS and front and rear cameras. The Autonom sensed the approaching truck, but it did not back up to avoid the crash.

Sometime in 2019, Rhode Island hopes to join a handful of cities in Australia, China, Japan, Sweden, Switzerland and the United States that are test-driving autonomous public transit on public roadways. In April, the Rhode Island Transportation Innovation Partnership (TRIP) announced its Autonomous Vehicle Mobility Challenge, soliciting proposals from companies interested in operating an autonomous shuttle that would link the Valley, Olneyville and Smith Hill neighborhoods that make up the Woonasquatucket Corridor in Providence. In July, the Rhode Island Department of Transportation (RIDOT), the lead agency on the project, received proposals from interested companies. It’s aiming to conduct closed-road tests by early next year.

“There’s a careful process, and we’re taking the time to develop the contract,” says Shoshanna Lew, RIDOT’s chief operating officer. “The plans are going to go through an extensive vetting process and there will be training for all of the entities to be involved. Safety is top of mind.”

The Rhode Island Public Transit Authority (RIPTA) sees the pilot as an opportunity to advance several urban planning goals: increasing public transit ridership, reducing the number of cars on the road and the need for parking, and improving mobility for the physically disabled, as well as RIPTA’s own flexibility to serve under-served areas.

“It’s another tool in our toolkit. That’s why we are at the table and we’re happy to be there, engaged in a cutting-edge discussion,” RIPTA’s then-interim CEO Amy Pettine says. “From the transit perspective, we will use the pilot to get consumer and driver feedback and we’ll be getting intelligence on how it differs from the way we operate today.” 

Engineers have been tinkering with autonomous vehicle technology for nearly 100 years. And if you own a recent-model car, it’s likely that you’ve experienced some level of autonomy first-hand — adaptive cruise control, lane keeping technology, emergency braking — with features that temporarily take control of the vehicle away from the human driver in certain circumstances. (Disclosure: my day job involves auto safety research.) The next levels of automation range from vehicles that mostly self-drive, with the human driver expected to intervene when things go wrong, to those in which humans are just there for the ride. TRIP’s autonomous shuttle will operate with a human driver aboard to act as a failsafe. 

The highest levels of automation have produced mixed results. To date, there have been four fatalities involving autonomous vehicles. Three occurred in Tesla passenger cars in Autopilot mode. The human drivers had let the cars take the wheel, and the Teslas failed to perceive large, looming objects in their paths. (In May 2016, forty-year-old Joshua Brown, a Tesla enthusiast, died after his Model S barreled into the side of an eighteen-wheeler crossing a Florida highway.)

The most recent occurred in March in Arizona, when a retrofitted Volvo being tested by Uber struck a pedestrian. The National Transportation Safety Board, which investigates transportation crashes and uses its findings to recommend policy changes, found that the self-driving Volvo was traveling at forty-three miles per hour on its second test run of the night when it took critical seconds to identify a woman walking her bicycle across the road as a pedestrian. Moments before the impact, the system determined that emergency braking was needed, but Uber had disabled the feature “to reduce the potential for erratic vehicle behavior.” The human driver, apparently, did not have her eyes on the road and did not intervene.

Philip Koopman, a Carnegie Mellon University professor of electrical and computer engineering who is an expert on embedded computer systems, sees trouble ahead. No reasonable amount of road testing can ensure the safety of autonomous vehicles, he says. Safety is better certified by a rigorous development process overseen by third-party independent entities. As it stands, there is no transparency.   

“Right now, we’re taking their word for it that these vehicles are safe, but no one is independently ensuring the engineering process. People will have to decide how much they trust car companies to do the right thing. And if you read the news, there’s a reason people might not want to trust them,” he says. “The other catch is: Many of the people working on these projects are not car people. They are robotics people, who, historically, haven’t had to worry about safety. It’s a very hard problem.” 

According to the National Conference of State Legislatures, twenty-nine states have enacted laws regarding autonomous vehicles. Governors in another ten states have introduced this technology via executive order. States have been forced to freelance the rules because the National Highway Traffic Safety Administration, which promulgates safety standards, has decided to forgo federal regulations in the interest of encouraging this technology.

“States are taking a variety of approaches,” says Russ Martin, director of government relations for the Governors Highway Safety Association. “There’s a tremendous amount of interest in welcoming and promoting the safe use of technology, but there’s also a high-level challenge: How do we regulate this technology, knowing what we know today? State regulators don’t have critical depth of knowledge of how autonomous vehicles work. How do you assess the safety of that vehicle? Some states are taking a more attentive approach; some are taking a more liberal approach.”

California is an example of the former. The California DMV began working on autonomous vehicle regulations in 2012; two years later, its testing regulations went into effect. Fifty-three companies currently hold permits, which allow testing with an approved human driver behind the wheel. In April 2018, the state took another step forward with a second set of rules for testing completely driverless vehicles for public use. It is now working on regulations for commercial vehicles, including trucks. As of April, sixty-six collisions involving driverless vehicles had been reported to the DMV.

“Taking on the regulations for autonomous vehicles has been a learning experience for the department. This is not an area states would normally regulate. It would normally be at the federal level,” writes California DMV spokesperson Jessica Gonzalez in an email. “We have held many workshops and public hearings to get the regulations right. We have met with many states and counties regarding our autonomous vehicle regulations.”

Arizona is an example of the other end of the spectrum. Governor Doug Ducey entered into a secret agreement with Uber to allow the testing of driverless vehicles on public roads with little oversight. In June 2015, he issued an executive order permitting public testing with a human safety driver and allowing fully driverless pilot programs on university campuses. After the pedestrian’s death in March, Ducey suspended Uber’s driverless vehicle testing in Arizona.

Rhode Island is proceeding without a regulatory framework to guide tests. Lew promises that a multidisciplinary research team will examine every aspect of the pilot, though that team has not yet been assembled.

“We are putting a lot of safeguards in our proposal, including the development of a full safety program with lots of safety valves and multiple checkpoints,” she says.

The Autonom bus crash in downtown Las Vegas generated plenty of irony, but the damage was mostly of the public relations variety. No one was hurt. The truck driver received a traffic citation, and the Autonom, with some body damage, was sidelined for a day. The pilot program resumed, and the city hopes to extend it when the year-long run ends in three months, says Joanna Wadsworth, program manager for Las Vegas’s Information Technologies Department.

“It had a bump in a very low-speed incident,” she says. “And the way we handled it — we didn’t let it slow us down.”

Ellen Liberman is an award-winning journalist who has commented on politics and reported on government affairs for more than two decades.
