AUSTIN (KXAN) — Austinites could soon hail driverless taxis as General Motors’ Cruise rolls out “robotaxi” technology in Austin.
Cruise CEO Kyle Vogt announced in a Sept. 12 tweet that the company would expand its robotaxi service footprint, launching in both Austin and Phoenix within approximately 90 days. Cruise is a San Francisco startup acquired by GM roughly six years ago, according to reporting from the Associated Press.
And the Texas capital has seen growth in its autonomous vehicle landscape in recent years.
In May, Argo AI launched a driverless car fleet in partnership with Ford, Lyft, Walmart and Volkswagen. Argo AI established its Austin fleet with 20 vehicles used in pilot programs with Lyft for rideshare services and Walmart for grocery deliveries. While the vehicles drive themselves, two testing specialists sit in the front of each one to ensure it’s functioning properly.
Robot delivery service Coco also launched in Austin this year and has partnered with dozens of local businesses to drop off takeout food by robot.
Peter Stone is a computer science professor at the University of Texas at Austin and the robotics director for Texas Robotics. He said Austin’s business- and startup-friendly environment has proved beneficial in attracting autonomous vehicle technology to the region.
“It’s been really exciting to see more companies coming to Austin and seeing Central Texas as a great proving ground for autonomous vehicles,” he said. “Austin has always been considered one of the best places to try to roll out new technologies like this, and it’s great to see companies targeting us.”
When it comes to driverless vehicles, Stone said they operate like autonomous robots. Three key ingredients are needed for the technology to be safe and successful: perception, cognition and execution.
When building autonomous vehicles, Stone said the technology needs to be able to perceive and identify its environment: other cars, pedestrians and road lanes. From there, it needs to be able to take that input and make decisions on how to turn the steering wheel, when to accelerate and when to hit the brakes.
The final step, execution, marries perception and cognition to safely drive the vehicle through traffic at an appropriate speed and with its surroundings in mind.
These vehicles can use GPS technology to map their routes, but other sensors, such as cameras and laser range finders, pick up on what’s in their surroundings and inform how to proceed.
“On a road, there’s a lot of structure. There are lanes, there are intersections, there’s traffic signals, there’s stop signs,” Stone explained. “So it’s a task that is quite feasible to automate, and people have been working on [the technology] for quite a long time.”
With safety being one of the biggest questions surrounding autonomous technology, Stone said these vehicles often undergo thousands of miles of simulated road tests to learn to react to environmental features. One question he said cities like Austin need to factor in is potential override features within the vehicle that a rider can access in the event of a malfunction.
Stone added that some sort of autonomous vehicle identifier would also be beneficial so mixed traffic, meaning cars with drivers alongside those without, can safely share the road and anticipate the decisions autonomous vehicles make.
“I think people will start over time to get used to the kinds of decisions that autonomous cars make, and they may be different than the kinds of decisions that people make,” Stone said.