
Nvidia To Use Virtual Reality for Autonomous Vehicle Testing 

Nvidia today revealed how it’s planning to use virtual reality technology to accelerate the testing of autonomous vehicles. The new offering, dubbed Drive Constellation, could dramatically improve companies’ ability to test challenging driving conditions, such as snow or blinding light, that can be hard to encounter in the real world.

“I think the challenge that real world testing has today is you can drive and drive and drive, and not see anything that really is that challenging,” Nvidia Senior Director of Automotive Danny Shapiro told journalists yesterday at the GPU Technology Conference (GTC) here in San Jose, California. “Cars are pretty good now at staying in the lanes and maintaining safe distance to the cars in front of them. But it’s those anomalies, those weird situations.”

Nvidia will use the Drive Constellation offering to scale up the testing of the algorithms that autonomous vehicles use to make decisions. The solution, which is expected to be available in the third quarter, will combine two main products.

The first component in Drive Constellation will be the company’s Nvidia Drive Sim software, which simulates the data emitted from sensors used in an autonomous vehicle (AV), including visual cameras, thermal cameras, radar, and LIDAR. The second component will be Nvidia Drive Pegasus, the GPU-powered computer that actually processes the AV’s sensor data in real time.

Together, the components will allow Nvidia customers, such as Google and Uber, to significantly ramp up the number of miles driven in cyberspace before putting the AV on the road for a real-world test.
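The core idea is hardware-in-the-loop testing: one component synthesizes the sensor stream, and the in-vehicle computer consumes it much as it would on the road, closing the loop with driving commands. Below is a minimal, hypothetical Python sketch of that loop; the data shapes and function names (SensorFrame, simulate_frame, plan_controls) are illustrative assumptions, not Nvidia’s Drive APIs.

```python
# Hypothetical sketch of a hardware-in-the-loop cycle: a simulator stand-in
# emits synthetic sensor frames, and a stand-in for the in-vehicle computer
# turns each frame into driving commands. Names and data shapes are
# illustrative assumptions, not Nvidia APIs.
from dataclasses import dataclass
import random

@dataclass
class SensorFrame:
    camera: list[float]   # simulated pixel intensities (0.0-1.0)
    lidar: list[float]    # simulated range returns (meters)
    radar: list[float]    # simulated relative velocities (m/s)

def simulate_frame(sun_glare: float) -> SensorFrame:
    """Simulator stand-in: emits one frame of synthetic sensor data,
    with glare partially washing out the camera channel."""
    camera = [min(1.0, random.random() + sun_glare) for _ in range(8)]
    lidar = [random.uniform(5.0, 80.0) for _ in range(8)]
    radar = [random.uniform(-10.0, 10.0) for _ in range(4)]
    return SensorFrame(camera, lidar, radar)

def plan_controls(frame: SensorFrame) -> dict:
    """In-vehicle computer stand-in: turns sensor data into steering
    and braking commands for this frame."""
    nearest_obstacle = min(frame.lidar)
    return {"steer": 0.0, "brake": 1.0 if nearest_obstacle < 10.0 else 0.0}

# Closed loop: simulated sensors in, driving commands out, every frame.
for step in range(5):
    frame = simulate_frame(sun_glare=0.6)
    print(step, plan_controls(frame))
```

In a setup like the one Nvidia describes, the simulated frames would be rendered by Drive Sim and consumed by the actual Drive Pegasus hardware rather than by a software stand-in, which is what lets the virtual miles count as meaningful test coverage.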

A recent study from the Rand Corporation concluded that an AV needs the equivalent of billions of miles of road testing before it can be deemed safe enough to put into production in the real world. That would require huge fleets of AVs to hit the roads for testing, something few people would be willing to put up with, especially in light of two fatal accidents in the past two weeks: the Uber AV that killed a pedestrian crossing the street at night last week in Tempe, Arizona, and the Tesla Model X that hit a concrete barrier last Friday just up the road in Mountain View, California. (It’s unclear whether the Tesla was in Autopilot mode; the Uber had a backup driver but was in AV mode.)

Image from the Nvidia Drive Constellation AV simulator (Source: Nvidia)

Nvidia CEO Jensen Huang addressed the Tempe accident during an impromptu meeting with reporters following his two-and-a-half-hour keynote address this morning. “I believe what happened, first of all…was tragic and sad,” he said. “It also is a reminder that’s the exact reason why we’re doing this. We’re developing this technology because we believe it will save lives.”

Following the accident, Nvidia and its partner, Uber, suspended AV testing while investigators piece together exactly what happened. The Drive Constellation product was already scheduled to be launched before the Uber accident, making the timing coincidental. But Huang said the accident was an example of why products like Drive Constellation were needed.

“I believe that as a result of what happened last week, the amount of investment into the seriousness of developing AVs is going to go up,” the CEO said. “Anybody who thought that they could get by without supercomputers and simulators and vast amount of data collection and all those engineers dedicated to making sure that this product is as safe as we can make it — that sensibility has completely changed.”

Shapiro said Drive Constellation will help Nvidia customers get closer to the billions of test miles that Rand Corp. described, without compromising the safety of other people using the roads.

“What we’re able to do is create hazardous scenarios, or challenging scenarios,” he said. “If you were trying to train a vehicle how to drive in blinding sun, you basically have about 10 minutes a day where you can actually capture data that’s really challenging in that regard. Instead, in simulation we can create 24 hours a day of blinding sun, and then we can test all different kinds of scenarios of people swerving or bicycles or pedestrians or whatever the situation is.”
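As a rough illustration of the scenario sweep Shapiro describes, here is a hypothetical Python sketch that enumerates combinations of sun angle, hazard type, and vehicle speed and runs each one through a stand-in simulator; the parameter names and the run_scenario() helper are assumptions for illustration only, not part of Drive Constellation.

```python
# Hypothetical sketch of sweeping simulated conditions instead of waiting for
# rare real-world ones (e.g. roughly 10 minutes of blinding sun per day).
# Parameter names and run_scenario() are illustrative, not an Nvidia API.
from itertools import product

sun_elevations_deg = [2, 5, 10]        # low sun angles that produce glare
hazards = ["pedestrian_crossing", "cyclist_swerving", "vehicle_cut_in"]
ego_speeds_kmh = [40, 70, 100]

def run_scenario(sun_elevation_deg: int, hazard: str, speed_kmh: int) -> bool:
    """Placeholder: launch one simulated drive and report whether the AV
    handled it without a safety violation."""
    return True  # a real test harness would score the AV's behavior here

# Replay every combination of lighting, hazard, and speed.
outcomes = {
    combo: run_scenario(*combo)
    for combo in product(sun_elevations_deg, hazards, ego_speeds_kmh)
}
failures = [combo for combo, passed in outcomes.items() if not passed]
print(f"ran {len(outcomes)} scenario variants, {len(failures)} failures")
```

Even this toy grid produces 27 variants from three knobs, which hints at how simulation can multiply coverage of conditions that occur only briefly, or dangerously, on real roads.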

Huang offered a couple of compelling demos of Drive Constellation during his keynote. In the first, he showed a simulated AV driving Interstate 280 through downtown San Jose, a potentially harrowing ride at the wrong times of day. The demo showed another car that had hit a light pole while police cars, lights flashing, zoomed past. That brought up a question: how do AVs respond to police lights when they’re glaring into the AV’s cameras and sensors? Trying to simulate the red strobe effect of a police car would be quite difficult in the real world, but Nvidia can draw up that scenario, execute it any number of times, and see how the AV’s algorithms handle it.

The second demonstration showcased just how far the combination of AVs and virtual reality (VR) can go. Nvidia brought one of its AVs down from its Santa Clara headquarters and parked it in a private parking lot just south of the San Jose McEnery Convention Center. The company also installed a driving simulator on the stage of the keynote address this morning. Finally, the company had Drive Constellation software running on one of its GPU-powered supercomputers back in Santa Clara.

As the scene unfolded, an Nvidia employee on the keynote stage piloted a virtual AV that was simulated on the “holodeck” at Nvidia headquarters. Meanwhile, the virtual vehicle’s movements were transmitted to the actual AV in the parking lot behind the convention center. The Nvidia employee actually drove the AV a short distance via the holodeck hookup, parking it a quarter mile away.

“Mind blown,” Huang said.

About the author: Alex Woodie

Alex Woodie has written about IT as a technology journalist for more than a decade. He brings extensive experience from the IBM midrange marketplace, including topics such as servers, ERP applications, programming, databases, security, high availability, storage, business intelligence, cloud, and mobile enablement. He resides in the San Diego area.
