Meet the Woman Making Uber's Self-Driving Cars Smarter, Cheaper

Raquel Urtasun, head of Uber's autonomous-vehicle lab, believes cameras can replace expensive lidar in steering cars.

Next month in San Francisco, Uber will stand trial in federal court for allegedly cheating in the race to commercialize self-driving cars. Google parent Alphabet accuses Uber of stealing designs for sensors called lidars that give a vehicle a 3-D view of its surroundings, an “unjust enrichment” it says will take $1.8 billion to remedy. Meanwhile in Toronto, Uber has a growing artificial-intelligence lab led by a woman who’s spent years trying to make lidar technology less important.

Raquel Urtasun joined Uber in May—almost three months after Alphabet filed suit—to set up a new autonomous-vehicle research lab. She still works one day a week in her old job as an associate professor at the University of Toronto. And she has long argued that self-driving vehicles can’t reach the masses unless the industry weans itself off lidar.

Most autonomous vehicles in testing—including Uber’s—pack one or more lidar sensors. But each lidar device costs from several thousand to several tens of thousands of dollars. Urtasun has shown that in some cases vehicles can obtain similar 3-D data about the world from ordinary cameras, which are much cheaper.

“If you want to build a reliable self-driving car right now we should be using all possible sensors,” Urtasun says. “Longer term the question is how can we build a fleet of self-driving cars that are not expensive.”

Even reducing the number, or quality, of lidar sensors a vehicle needs to drive safely could shift the economics of autonomous cars. It might also help a company with legal troubles that make developing in-house lidar technology difficult.

Urtasun showed off results of her efforts to have cameras substitute for lidar at a computer-vision conference in New York a few weeks after joining Uber. They were enabled by recent advances in algorithms that learn to process images. Videos showed 3-D views of streets in Karlsruhe, Germany, extracted from stereo images from ordinary cameras. Urtasun said the system could run in real time, and compete with lidar within 40 meters of the car. That's a shorter range than high-end lidar sensors, suggesting that cameras can't yet do everything lidar can.
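Stereo depth estimation of this kind rests on a simple geometric relationship: for two calibrated cameras a known distance apart, a point’s depth is inversely proportional to its disparity—the horizontal pixel shift of that point between the left and right images. The sketch below shows only that disparity-to-depth conversion; the focal length and baseline are illustrative values loosely modeled on the stereo rig used to film Karlsruhe’s streets, and Urtasun’s actual system layers learned deep networks on top of this geometry.

```python
import numpy as np

# Depth from stereo disparity: Z = f * B / d, where
#   f = focal length in pixels, B = baseline between cameras in meters,
#   d = disparity in pixels (shift of a point between left and right images).
# The default f and B are illustrative, not from any real sensor spec.

def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.54):
    """Convert a disparity map (pixels) to metric depth (meters)."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)   # zero disparity means "infinitely far"
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

# A 10-pixel disparity on this rig works out to roughly 38 meters of depth,
# while 9 pixels is 42 meters: a single pixel of matching error shifts the
# estimate by several meters. That error grows quadratically with distance,
# which is one reason camera-based depth competes with lidar only within a
# few tens of meters of the car.
print(depth_from_disparity([10.0, 9.0]))
```

The inverse relationship is why range is the sticking point: nearby objects produce large disparities that are easy to measure precisely, while distant ones collapse into a pixel or two of shift.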

Self-driving-car projects also use lidar to gather and update the high-resolution maps autonomous vehicles need to navigate. Urtasun calls the cost and time involved a “fundamental issue” preventing widespread use of self-driving cars. Developing more scalable approaches to mapping is now one strand of her research at Uber.

Urtasun’s previous work has shown that smart-camera software might help with the mapping problem, too. Her University of Toronto lab developed software that could generate maps of roads, parking lanes, sidewalks and other features from aerial and ground-level photos. Another project showed how cars might observe the position of the sun to determine their location without GPS. Eight of her grad students joined Uber with her; the group now numbers about 30, and is still hiring.

Urtasun’s prominence at Uber reflects a relatively new school of thought in the world of self-driving cars. The rush to commercialize the technology was catalyzed by a series of contests organized by the Pentagon in the mid-aughts. The community that formed was and still is dominated by roboticists, who tend to focus on developing reliable individual components and engineering them together, says Jianxiong Xiao, a former Princeton professor.

Xiao and Urtasun come from a different field, computer vision. Xiao argues that they bring with them a nimbler mindset, helped by big leaps since 2012 in computers’ ability to understand images, driven by an AI technique called deep learning. Urtasun believes ideas from that world will be central to achieving the dreams of the field. Xiao is CEO of AutoX, a 40-person company that modifies cars to drive themselves, even in the dark or during rain, just by adding software and a few cameras.

AutoX has company in the form of Tesla. CEO Elon Musk says he can offer full autonomy without lidar, using the cameras and radar in Tesla vehicles today. Xiao argues that cameras and radar are rapidly becoming standard in cars through driver-assistance features, but it will take many years for the industry to integrate a new technology such as lidar.

Musk and Xiao are outliers, though. In October, GM and Ford each bought lidar companies to support their self-driving projects. Others in the field are pinning their hopes on the many companies working to develop new, cheaper forms of lidar.

“I’m eating popcorn watching the competition starting,” says Tarin Ziyaee, CTO of Voyage, a company whose lidar-equipped self-driving cars are being tested by residents of a retirement community in San Jose. Voyage vehicles currently sport a spinning $80,000 sensor on the roof. Ziyaee wants to pay much less, but argues that the sensors don’t have to become as affordable as a new muffler to make economic sense. Voyage, like Uber, hopes to operate fleets of robotaxis that serve many people and could thus earn back upfront costs quickly.
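The robotaxi math Ziyaee is alluding to can be sketched in a few lines. All of the utilization and margin figures below are illustrative assumptions, not Voyage’s or Uber’s actual economics; only the $80,000 sensor price comes from the article.

```python
# Back-of-envelope: how fast a heavily used robotaxi earns back an
# expensive sensor. Figures other than the sensor price are assumptions.
lidar_cost = 80_000        # article's figure for the roof-mounted sensor, dollars
rides_per_day = 40         # assumed utilization of one robotaxi
margin_per_ride = 10.0     # assumed revenue minus operating cost per ride, dollars

days_to_recoup = lidar_cost / (rides_per_day * margin_per_ride)
print(f"{days_to_recoup:.0f} days to recoup the sensor")  # 200 days
```

Under these assumptions the sensor pays for itself in well under a year—which is why fleet operators can tolerate hardware prices that would sink a consumer car, so long as the vehicles stay busy.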

Uber is already operating some robotaxis in Pittsburgh and Phoenix—albeit with a human in the driver’s seat in case of problems. Alphabet’s Waymo division announced Tuesday that it has moved safety drivers to the back seat of vehicles giving rides in Arizona. How quickly Uber can close that gap with Alphabet will depend on what happens in court next month—and on Urtasun’s ideas for making self-driving technology smarter.