Lost in space? How lidar ensures robots know more about their surroundings
By Danielle Lucey
Warning: Objects in your mirror are closer than they appear. And robotics has the answer for bringing that archaic notion into the 21st century. Most drivers currently use a series of mirrors to determine their surroundings, but for many robots, including the Google car, lidar is proving a better substitute than a quick glance and a prayer. “If you’re driving on the street and somebody passes you, you want to know if somebody comes from behind before you start a passing maneuver,” says Wolfgang Juchmann, product marketing manager at Velodyne Acoustics’ lidar division. “Essentially each time you look in your rearview mirror, you want to look backwards.” Velodyne’s sensors provide this capability on a lot of high-profile projects.
Its devices are the sensor of choice for Google’s self-driving car program, Oshkosh’s TerraMax, Lockheed Martin’s Squad Mission Support System and TORC Robotics’ Ground Unmanned Support System, to name a few. The company was also tapped by rock band Radiohead to create its Grammy-nominated “House of Cards” music video.
Velodyne got its start as a spinoff of the DARPA Grand Challenges, where company founders David and Bruce Hall entered the competitions as Team Digital Audio Drive, or DAD. The brothers had previous robotics experience from competitions such as “BattleBots,” “Robotica” and “Robot Wars” in the early 2000s. After the first Grand Challenge, the Halls realized all the teams had a sensor gap they could fill. Stereovision was not good enough for the task, so they invented the HDL-64 lidar in 2005 and entered the second Grand Challenge with the sensor, though a steering control board failure ended their run prematurely. By 2006, the company had started selling a more compact version of the sensor, the HDL-64E. By then, teams were gearing up for DARPA’s Urban Challenge. Instead of entering the competition themselves, the brothers sold their device to other competitors. Five of the six teams that finished used their lidar, including the top two.
How lidar works
Though the device proved a breakthrough in autonomous sensing technology, lidar is not a new concept. “The lidar itself is a technology that’s been around for a long time,” says Juchmann. “The laser beam hits an object and the object reflects light back. The time this takes tells us how far away the object is, and the amount of light reflected back gives us an idea about the reflectivity of the object.” Lidar works in a similar way to radar, in that it measures the time it takes for a signal to return to its point of origin, though it ditches radio waves for laser beams. Because of the different nature of the two media, radar excels at measuring faraway objects, while Velodyne’s sweet spot is within a 100-meter radius, says Juchmann. Lidar, however, has better angular resolution overall.
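That principle reduces to a single line of arithmetic: the pulse travels out and back, so the distance is half the round-trip time multiplied by the speed of light. Here is a minimal sketch of the idea Juchmann describes; the function name and the 667-nanosecond example are illustrative assumptions, not Velodyne specifications.

```python
C = 299_792_458.0  # speed of light in a vacuum, meters per second

def range_from_echo(round_trip_seconds: float) -> float:
    """One-way distance to the object that reflected the pulse, in meters."""
    # The pulse travels out and back, so halve the round trip.
    return C * round_trip_seconds / 2.0

# A return arriving ~667 nanoseconds after firing came from ~100 meters away,
# right around the sweet spot Juchmann describes.
print(range_from_echo(667e-9))  # ~99.98
```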
Velodyne’s lidar mounted atop Google’s self-driving Lexus. Photo courtesy Google.
What makes Velodyne’s product different from simple lidar technology, explains Juchmann, is that instead of using one laser to determine an object’s range, it uses 64. “Instead of just shooting one laser to the wall, we shoot 64 all on top of each other, so if you look at the wall you’ll see a [vertical] line of dots,” says Juchmann. “This means you can see a wall with a resolution of 64 lines in a vertical field of view of about 26 degrees.” Instead of measuring the time-to-distance correlation of this series of dots all at once, Velodyne measures them one after the other, in series, to capture the distance data from each point. If you were shooting the lasers toward a flat wall, it would be a fairly easy measurement, says Juchmann, because the
laser data would return almost simultaneously. However, if the series of laser points were flashed toward a staircase, it would mark faster returns on the lower steps and longer returns as the steps ascend, giving the user an idea of the varying distances.
The measurement of a single vertical line in space is not very useful on its own, though, especially to large cars trying to navigate their environment at fairly high speeds. So Velodyne’s sensor also spins these 64 points, sweeping 64 lines through the whole room. “The amazing part is the amount of data that is measured in a very short time,” he says. A human blink lasts about two-fifths of a second. In that time span, Velodyne’s lidar has done a 360-degree scan of its surroundings four times. This 10-times-per-second scan produces 1.3 million data points per second. At this speed, lidar can measure an object’s location with centimeter-level accuracy. Much older methods, like surveying, can measure position down to the millimeter, but between high-definition lidar’s speed and breaking out some tripods, it is no contest.
After the success of the company’s HDL-64E, it also released the HDL-32E, which uses the same concept but with 32 laser points instead of 64. This is useful for smaller vehicles, because Velodyne’s HDL-32E weighs 1 kilogram, versus 15 kilograms for double the laser points. That is a huge factor when people want to mount lidar on something lighter, explains Juchmann. It’s also less than half the price.
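A quick back-of-the-envelope calculation shows what those sweeps imply. Everything below is derived purely from the round figures quoted above, not from Velodyne’s published specifications; the sensor’s actual firing pattern may differ.

```python
# All inputs are the article's round numbers.
scans_per_second = 10          # full 360-degree sweeps per second
points_per_second = 1_300_000  # total returns per second
lasers = 64                    # vertical channels

points_per_scan = points_per_second // scans_per_second  # 130,000 per sweep
points_per_laser = points_per_scan // lasers             # ~2,031 firings per channel per sweep
horizontal_step = 360 / points_per_laser                 # ~0.18 degrees between firings

blink = 0.4                                              # "two-fifths of a second"
print(scans_per_second * blink)                          # 4.0 sweeps per blink
```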
How the Google car sees the path and obstacles ahead, using lidar integrated with other data and sensors. Photo courtesy Google.
To make all this data useful, companies integrate Velodyne’s lidar data with GPS and IMU data to determine how their robots should move.
“If you have a robot or a self-driving car that moves around, it’s important to see what’s around it,” says Juchmann.
“The vehicle needs to know where exactly it is,” says Juchmann. “Typically you have to feed in GPS information so you know where you actually are. With our sensor you can integrate and synchronize GPS information in order to determine not only the range, but also where you are.”
The IMU compensates for the movements and angles that inherently occur when the sensor moves through the real world. The key to all this data, though, is the software each company creates to analyze it. The Google self-driving car, for instance, integrates this data with the Google Maps product, so the robot knows the long-range terrain and can also detect if, for example, a bicyclist is coming up behind the car as it is about to turn.
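As a rough illustration of that fusion, here is a minimal two-dimensional sketch, assuming a flat world and hypothetical function and variable names; this is not Velodyne’s or Google’s actual software. GPS supplies the vehicle’s position, the IMU its heading, and each lidar return is rotated and translated from the sensor’s frame into world coordinates.

```python
import math

def lidar_return_to_world(range_m, bearing_rad, gps_x, gps_y, imu_heading_rad):
    """Place one lidar return in world coordinates (flat, 2-D simplification).

    range_m, bearing_rad -- the return as measured in the sensor's own frame
    gps_x, gps_y         -- vehicle position from GPS, in meters
    imu_heading_rad      -- vehicle heading from the IMU
    """
    # Rotate the sensor-frame bearing by the vehicle's heading...
    angle = bearing_rad + imu_heading_rad
    # ...then translate by the vehicle's GPS position.
    return (gps_x + range_m * math.cos(angle),
            gps_y + range_m * math.sin(angle))

# An object sensed 20 m dead ahead of a vehicle at (100, 50) heading due east
# lands at roughly (120, 50) in world coordinates.
print(lidar_return_to_world(20.0, 0.0, 100.0, 50.0, 0.0))
```

In practice the fusion is three-dimensional, with the IMU supplying roll and pitch as well, but the rotate-then-translate pattern is the same.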
Not all of lidar’s technological hurdles have been overcome, though. Lidar sensors are affected by low-visibility conditions the same way human eyes are. For instance, the laser beam can detect drops of rain, and if the rain is heavy enough the sensor might read a downpour as an object. Juchmann likens it to watching an antenna TV with some white noise.
“You still see a picture, but only once in a while you have the full picture. If the rain becomes really, really heavy, you have more rain than picture.” The same is true for fog and snowfall. “If you have a little bit of that it’s all fine, and computer algorithms can figure out the once-in-a-while reflection, but if it’s heavy snowfall” the reflections will outweigh the actual picture, explains Juchmann.
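What might such an algorithm look like? One hedged illustration, assuming a simple density test rather than whatever Velodyne’s customers actually ship: solid objects return dense clusters of points, while a raindrop tends to return an isolated point, so returns with too few neighbors can be discarded.

```python
import math

def drop_isolated_returns(points, radius=0.5, min_neighbors=2):
    """Keep only returns that have enough close neighbors (naive O(n^2) sketch).

    Solid surfaces produce dense clusters of returns; a raindrop or snowflake
    usually produces a lone point, so sparse returns are treated as noise.
    """
    kept = []
    for i, p in enumerate(points):
        neighbors = sum(1 for j, q in enumerate(points)
                        if i != j and math.dist(p, q) <= radius)
        if neighbors >= min_neighbors:
            kept.append(p)
    return kept
```

In heavy rain or snow, exactly as Juchmann describes, the noise points stop being isolated, and a filter like this starts throwing away picture along with the rain.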
Other applications
Lidar has a lot of applications outside robotics. Right now, Velodyne is addressing the security and surveillance market, says Juchmann, which could use lidar to monitor military perimeters and border fences. Today, many fences are monitored with cameras, which at best have around 130-degree fields of view. Another big market for lidar is mobile mapping. Transportation department contractors put the sensors on manned vehicles and, using cameras and other sensors, give state transportation departments information on the condition of bridges and roads. The accurate mapping provides an idea of the roadwork and maintenance that needs to be done. AAI Textron uses Velodyne’s lidar on its Common Unmanned Surface Vehicle, to determine if there are intruders in the immediate vicinity and for collision avoidance.
Aside from Google, Juchmann says nearly every major car manufacturer in the world uses one or two of the company’s lidar units to test the other sensors that have made their way onto cars in the last 10 years. Automakers compare the lidar’s results against their backup warning, lane-keeping and blind-spot detection systems to measure those sensors’ accuracy. Juchmann predicts, however, that the auto industry will be the big boon for lidar, once the sensors are adopted on every vehicle.
“The next big step is to get integrated into the large-volume products,” he says.
For this to happen, the cost needs to come down and the sensors have to get smaller. Also, many cars still rely on outdated computing technology that “isn’t adequate for modern sensor technology anymore.” While this isn’t a problem for Google, which uses its own computers, in traditional cars these old systems can be a bottleneck to how we actually use all this data.
And there is one more big hurdle. “People don’t want to have that thing on the top [of their car], and that’s where the balance between form and function needs to be found,” says Juchmann. The first thing all the car designers say is, “There’s no way in hell this thing is going to be on top of the car.”
No one has come up with an answer for that yet, says Juchmann, but similar problems have been solved in the past. He points to satellite radio, which originally required large antennas. “But at some point somebody made the decision that we’re going to have satellite radio inside the car. That’s the future we need, and the only way that’s physically going to work is to have an antenna on the outside,” he says. “Let’s come up with a design that doesn’t look really bad, so they came up with the shark fin design.”
The small fin is now on the back of most cars with satellite radio. The best spot for lidar remains the top of a vehicle, though, so how this final challenge plays out is still an open question.
Danielle Lucey is managing editor of Mission Critical.
AAI/Textron’s CUSV uses a Velodyne lidar, the small sensor at the very top of the vessel, to image the maritime landscape. Photo courtesy Textron.