
Nvidia on the Spot: Answering Autonomous Driving Ethical Questions 


We’ve all heard the question: An autonomous car carrying an adult comes around a blind turn and detects a child in the middle of the road. To the left is a cliff, to the right are pedestrians on the sidewalk. The car must make a snap decision. What will, what should, it do?

At Nvidia’s annual GPU Technology Conference (GTC) last week in San Jose, we put the ethical quandary to the refreshingly candid Danny Shapiro, the company’s senior director of automotive, after a press briefing that included announcements about the extension of Nvidia’s partnership with Toyota, the world’s largest car maker. As part of the partnership, Toyota will become the first customer to adopt Nvidia’s new DRIVE Constellation platform, a simulation and test environment.

Nvidia said Toyota’s research and development arm, the Toyota Research Institute-Advanced Development, will use the Constellation platform to simulate the equivalent of billions of miles of autonomous vehicle (AV) driving in challenging scenarios. According to the company, “The cloud-based solution enables millions of miles to be driven in virtual environments across a broad range of scenarios — from routine driving to rare or even dangerous situations — with greater efficiency, cost-effectiveness and safety than what is possible in the real world.”


And that is part of Shapiro’s answer to the AV ethics question. Over the longer term, cars will become smarter, with multiple layers of sensor technology to analyze driving scenarios better and faster. But for the shorter term, he said, “I don’t have a good answer.” By this he meant several things: for one, it will be up to the car maker using Nvidia technology, and not Nvidia itself, to program the car for ethically challenging (and unlikely) situations. He also made the point that overall, AV will improve public safety and save thousands of lives annually.

“I don’t have a good answer, because we’re creating the underlying technology, our automaker companies will decide,” he said. And he noted that car companies have discussed ethical questions “but they realize if they’re not prioritizing the occupants of the vehicle they’re not going to sell very many cars, so everyone’s shying away from this a little.”

Fortunately, the discussion does not end there. Over time, AV will become increasingly safe as more safety measures are designed into cars and as the autonomous driving infrastructure is built out. That infrastructure could eventually include car-to-car communications, so that a car ahead can notify the cars behind it when a danger (black ice detected by its traction system, flooding, pedestrians in the road) lies ahead. It also could include car-to-infrastructure communications, in which cameras placed on telephone poles could, for example, detect sources of danger around an upcoming blind turn and notify approaching cars.
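To make the car-to-car idea concrete, here is a minimal sketch of what a hazard broadcast might look like. The message fields, the hazard names and the use of UDP broadcast (standing in for a dedicated vehicle radio such as DSRC or C-V2X) are all illustrative assumptions, not a standardized V2V protocol.

```python
from dataclasses import dataclass
import json
import socket
import time

# Hypothetical hazard types a car might broadcast to vehicles behind it.
HAZARD_TYPES = {"BLACK_ICE", "FLOODING", "PEDESTRIAN_IN_ROAD"}

@dataclass
class HazardAlert:
    hazard: str        # one of HAZARD_TYPES
    lat: float         # location of the detected hazard
    lon: float
    timestamp: float   # seconds since epoch, so receivers can drop stale alerts

    def to_bytes(self) -> bytes:
        return json.dumps(self.__dict__).encode()

def broadcast_hazard(alert: HazardAlert, port: int = 37020) -> None:
    """Send a hazard alert to every listener on the local network segment.

    A real V2V system would use a dedicated radio link, signed messages
    and a standardized payload; UDP broadcast here just illustrates the
    one-to-many pattern.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(alert.to_bytes(), ("255.255.255.255", port))
    sock.close()

if __name__ == "__main__":
    # A car whose traction control just detected black ice warns followers.
    broadcast_hazard(HazardAlert("BLACK_ICE", 37.3337, -121.8907, time.time()))
```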

“I think being able to predict human behavior and crazy stuff that (people do) is impossible, so what we can do is have advanced sensing and actuation to mitigate (problems),” he said. “I mean if we have just autonomous cars on the road it will be easy not to have any accidents. But if you have other human-driven vehicles or pedestrians or whatever out there, people do crazy stuff.”

Shapiro also offered the common-sense notion that cars should proceed slowly around blind turns, providing time to stop the car if a child is in the road.
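The physics backs him up. A car can stop within the road it can see only if its reaction distance plus braking distance fits inside the sight distance, that is, v·t + v²/(2a) ≤ d. The sketch below solves that bound for the maximum safe speed; the reaction time and deceleration figures are assumed round numbers, not measured values.

```python
import math

def max_safe_speed(sight_distance_m: float,
                   reaction_time_s: float = 0.5,
                   decel_mps2: float = 6.0) -> float:
    """Largest speed v (m/s) such that reaction distance plus braking
    distance fits in the visible road:  v*t + v**2/(2*a) <= d.
    Rearranged, v**2 + 2*a*t*v - 2*a*d = 0; take the positive root."""
    a, t, d = decel_mps2, reaction_time_s, sight_distance_m
    return -a * t + math.sqrt((a * t) ** 2 + 2.0 * a * d)

# With only 20 m of visible road around a blind turn, the car must hold
# about 12.8 m/s (~46 km/h, ~29 mph) to guarantee it can stop in time.
print(round(max_safe_speed(20.0), 1))  # 12.8
```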

Working out solutions to ethical and other AV problems will require extensive simulation and testing, which is what DRIVE Constellation is designed for. It’s a data center solution composed of two side-by-side servers. The first server — DRIVE Constellation Simulator — generates the sensor output from the virtual car. The second server — DRIVE Constellation Vehicle — contains Nvidia’s DRIVE AGX Pegasus AI car computer, which receives the sensor data, makes decisions and sends vehicle control commands back to the simulator.
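In outline, that loop can be pictured as follows. This is a schematic sketch of hardware-in-the-loop testing in general; the class and method names are invented for illustration and are not Nvidia’s API.

```python
import time

class SimulatorServer:
    """Stands in for the simulator side: renders the virtual world and
    synthesizes camera/radar/lidar output for one timestep."""

    def step(self, controls: dict) -> dict:
        # Advance the vehicle physics using the last control command,
        # then render sensor data for the new world state (stubbed here).
        return {"camera": b"...", "radar": b"...", "lidar": b"..."}

class VehicleServer:
    """Stands in for the vehicle side: the AV software stack running on
    the actual in-car computer, fed simulated sensor data."""

    def drive(self, sensors: dict) -> dict:
        # Perception -> planning -> control, exactly as on a real road.
        return {"steering_rad": 0.0, "throttle": 0.2, "brake": 0.0}

def run_closed_loop(sim: SimulatorServer, car: VehicleServer,
                    steps: int, hz: float = 30.0) -> None:
    """Drive the two halves in lockstep: sensor data out, controls back."""
    controls = {"steering_rad": 0.0, "throttle": 0.0, "brake": 0.0}
    for _ in range(steps):
        sensors = sim.step(controls)   # simulator -> synthetic sensors
        controls = car.drive(sensors)  # AV stack -> control commands
        time.sleep(1.0 / hz)           # hold a fixed frame rate

run_closed_loop(SimulatorServer(), VehicleServer(), steps=3)
```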

“This closed loop process enables bit-accurate, timing-accurate hardware-in-the-loop testing,” Shapiro said. “DRIVE Constellation can achieve massive amounts of driving experience — 3,000 units can drive over 1 billion miles per year. More importantly, each mile driven in DRIVE Constellation contains events of interest — including rare or hazardous scenarios.”

Nvidia has used Constellation internally for a year, and now Toyota is the first customer to adopt it. Shapiro said Nvidia is opening the Constellation platform by providing a programming interface that allows Nvidia AV ecosystem partners to integrate their environment models, vehicle models, sensor models and traffic scenarios, allowing “other companies to build on our DRIVE Simulator software and run it on Constellation.”
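Nvidia did not publish the details of that interface at the briefing, but the plugin pattern it describes might be sketched like this; every name and signature here is hypothetical.

```python
from abc import ABC, abstractmethod

class SensorModel(ABC):
    """A partner-supplied model of a camera, radar or lidar."""
    @abstractmethod
    def render(self, world_state: dict) -> bytes: ...

class TrafficScenario(ABC):
    """A partner-supplied scenario: other road users and their behavior."""
    @abstractmethod
    def actors(self, t_seconds: float) -> list[dict]: ...

class SimulationHost:
    """The platform registers partner plugins and composes them each
    timestep, so third-party models run inside the same simulation."""

    def __init__(self) -> None:
        self.sensors: list[SensorModel] = []
        self.scenarios: list[TrafficScenario] = []

    def register_sensor(self, sensor: SensorModel) -> None:
        self.sensors.append(sensor)

    def register_scenario(self, scenario: TrafficScenario) -> None:
        self.scenarios.append(scenario)
```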

He said Israel-based simulation company Cognata is developing traffic and scenario models supported by the Constellation platform. Cognata uses real-world data captured by traffic cameras around the world to create a large-scale traffic model. Developers can define the number of other vehicles and road users, as well as their behavior, along with weather conditions, based on real-world traffic data.

“They’re taking data from traffic cameras and we can import all those scenarios in that traffic into the simulator, so we can simulate how our autonomous vehicle will respond to real world traffic,” Shapiro said.

Also at GTC was Nvidia AV partner Clarion, which demonstrated its “autonomous summon” technology. Running on Nvidia DRIVE AGX Xavier, the system enables a car owner to call a parked, driverless vehicle to a pick-up spot. Guardian Optical Technologies demonstrated its in-cabin sensor to detect passengers; the system sends alerts if a child or pet is left behind in the back seat. Chinese startups WeRide and AutoX announced they will use Nvidia’s DRIVE AGX Pegasus to deliver Level 4 autonomous taxi and delivery services. AutoX said its prototype Level 4 AV is performing self-driving deliveries in a San Jose pilot project. And autonomous trucking company TuSimple unveiled its automotive-grade camera for night and low-light vision, which the company said is a key component for production-level autonomous trucks.

Nvidia at GTC also launched Safety Force Field (SFF), a part of its DRIVE AV software suite. SFF is a planning and control layer that includes “driving policy,” analyzing and predicting the dynamics of the surrounding environment by taking in sensor data and determining actions to protect the vehicle and other road users. Nvidia said the SFF framework “ensures these actions will never create, escalate or contribute to an unsafe situation and includes actions necessary to mitigate potential danger…. SFF makes it possible for vehicles to achieve safety based on mathematical zero-collisions verifications, rather than attempting to model the high complexity of real-world scenarios via limited statistics.”
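Nvidia did not spell out the math at GTC, but the flavor of such a guarantee can be shown in a drastically simplified, one-dimensional sketch (not Nvidia’s formulation): treat full-force braking as each vehicle’s safety procedure, and call a following distance safe only if, were both cars to brake flat-out right now, the follower would still come to rest behind the leader. The deceleration and margin values below are assumptions.

```python
def stopping_point(position_m: float, speed_mps: float,
                   decel_mps2: float) -> float:
    """Where a vehicle comes to rest if it brakes at full force now:
    position + v**2 / (2*a)."""
    return position_m + speed_mps ** 2 / (2.0 * decel_mps2)

def following_is_safe(follower_pos: float, follower_speed: float,
                      leader_pos: float, leader_speed: float,
                      decel_mps2: float = 6.0, margin_m: float = 2.0) -> bool:
    """True if, should both vehicles execute their safety procedure
    (full-force braking) right now, the follower still comes to rest
    at least margin_m behind the leader."""
    return (stopping_point(follower_pos, follower_speed, decel_mps2) + margin_m
            <= stopping_point(leader_pos, leader_speed, decel_mps2))

# A 30 m gap at matched 20 m/s speeds: both cars need 33.3 m to stop,
# so the full 30 m gap is preserved and the check passes.
print(following_is_safe(0.0, 20.0, 30.0, 20.0))  # True
```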

These and myriad other technologies, now and in the future, will join the increasingly ubiquitous AV ecosystem.

“This technology will continue to evolve,” Shapiro said, “continue to get better and better.… People talk about 5G, it’ll be a while before we get really useful data, eventually it will be good, but it’s kind of like the fax machine when it started. How many cars will have vehicle-to-vehicle communications, not that many, so you can’t rely on it. But those are the kind of things at some point we may say the benefits are so great that every new car has to have it and old cars have to be retrofitted.”
