Advanced Computing in the Age of AI | Friday, March 29, 2024

The Future of Driving Comes to GTC 

<img style="float: left;" src="http://media2.hpcwire.com/dmr/hondaHUD.jpg" alt="" width="95" height="54" />In the past decade, in-car electronics have evolved from cigarette lighter-powered TomToms to sophisticated infotainment systems that integrate your phone, GPS, and even your iPod. But based on the automotive seminars that took place at this week's GPU Technology Conference (GTC), navigation and hands-free calling may be considered downright primitive compared to what is in store.

Victor Ng-Thow-Hing, principal scientist at the Honda Research Institute in Mountain View, Calif., demonstrated a heads-up display (HUD) that has crossed over into the realm of augmented reality. Rather than displaying the car's speed or turn-by-turn directions, Honda has been able to bridge the gap between 2D HUD and the 3D environment beyond the windshield.

The first example projected street names onto the windshield so that they appear as signs on that street's buildings. While this isn't strictly 3D, the system uses the same one-point perspective technique artists use to make the text appear to recede into space, making it easier for the driver to tell whether they're about to cross a street or are already on it.
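The receding-text effect above boils down to a simple rule of one-point perspective: a label's apparent size and screen position shrink in proportion to its distance from the viewer. The sketch below is purely illustrative; Honda's actual rendering pipeline was not detailed in the talk, and all function names here are assumptions.

```python
# A minimal sketch of one-point perspective: points and labels farther
# down the street shrink toward the vanishing point, so projected street
# names appear to recede into the scene beyond the windshield.

def project(x, y, z, focal_length=1.0):
    """Project a 3D point (camera space, metres) onto the 2D HUD plane."""
    scale = focal_length / z  # farther points collapse toward the vanishing point
    return x * scale, y * scale

def label_scale(distance_m, focal_length=1.0):
    """Apparent size factor for a street-name label at a given distance."""
    return focal_length / distance_m

# A label 10 m away renders twice as large as one 20 m away:
print(label_scale(10.0) / label_scale(20.0))  # 2.0
```

The inverse-distance scaling is what lets the driver's eye read depth from flat text: a name drawn small and high on the windshield reads as "farther down the road."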

Honda expects this will be especially useful for cities, where street signs are not always visible and building numbers aren't always big enough to read as you're driving by.

Ng-Thow-Hing also demonstrated two additional driver-assistance projections. The first is a three-by-three grid of nine squares designed as a quick blind-spot reference. In the center of the grid is a green triangle designating the driver's car. Whenever an adjacent square turns red, it means there is a car in that spot on the road. Its primary purpose is to make drivers more aware of their blind spots, but it also offers a quick, comprehensive view of the road overall.
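The grid described above can be sketched as a simple mapping from sensor detections to cell colors. This is an illustrative toy, not Honda's implementation; the cell coordinates and sensor format are assumptions for the example.

```python
# Blind-spot grid sketch: the driver's car occupies the center cell of a
# 3x3 grid, and any neighbouring cell "turns red" when a sensor reports
# a vehicle in the corresponding spot on the road.

CELLS = [(r, c) for r in range(3) for c in range(3)]
CENTER = (1, 1)  # the driver's car (the green triangle)

def grid_state(detected):
    """Map a set of detected cells {(row, col), ...} to a colour per cell."""
    state = {}
    for cell in CELLS:
        if cell == CENTER:
            state[cell] = "green"
        elif cell in detected:
            state[cell] = "red"    # a car occupies this spot on the road
        else:
            state[cell] = "clear"
    return state

# A car sitting in the driver's left blind spot:
state = grid_state({(1, 0)})
print(state[(1, 0)])  # red
```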

The second projection was designed to help drivers turn across intersections more safely. Because drivers don't always correctly estimate the speed and proximity of oncoming cars, they can make mistakes that lead to accidents. This system uses sensors to measure speed and distance while an on-board processor (such as NVIDIA's specially designed Visual Computing Module) calculates whether it is safe for the driver to turn through the intersection. If the turn is unsafe, the projected path of the driver's car is displayed in red.
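At its core, that calculation is a gap-acceptance check: compare the oncoming car's time-to-arrival against the time the driver needs to clear the intersection, plus a safety margin. The sketch below is a hedged illustration of that logic only; the actual criteria the on-board processor evaluates, and the numeric thresholds, were not described in the talk and are invented here.

```python
# Toy gap-acceptance check: the turn is "safe" only if the oncoming car
# would arrive at the intersection well after the turn completes.

def turn_is_safe(oncoming_distance_m, oncoming_speed_ms,
                 turn_time_s=4.0, margin_s=2.0):
    """Return True if the oncoming car arrives after turn_time + margin."""
    if oncoming_speed_ms <= 0:
        return True  # the oncoming car is stopped or moving away
    time_to_arrival = oncoming_distance_m / oncoming_speed_ms
    return time_to_arrival > turn_time_s + margin_s

# 50 m away at 15 m/s (~3.3 s to arrive): the projected path would show red.
print(turn_is_safe(50, 15))   # False
print(turn_is_safe(120, 15))  # True
```

In a real system the speed and distance inputs would stream continuously from the sensors, so the red/green verdict updates as the gap closes.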

Audi is tackling a similar problem with its Urban Intelligent Assistant program, as presented by Mario Tippelhofer, an engineer at Volkswagen's Electronic Research Lab in San Francisco, Calif.

Audi's program targets the problems common to city driving: heavy urban traffic, driver distraction, and the seemingly endless search for parking.

One application that Tippelhofer previewed, “Smart Parking,” uses GPS to navigate you to the nearest available parking spot relative to the destination, rather than to the destination itself. By taking the guesswork out of finding a place to leave your car, Audi hopes the system will help cut down on the number of accidents caused by drivers who are scanning for parking spaces instead of watching the road.
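The routing idea can be sketched in a few lines: instead of driving to the destination, pick the free spot that minimizes the walk to it. This is a minimal illustration under stated assumptions; Audi's system presumably uses live availability feeds and road-network distances rather than the straight-line metric and hard-coded spot list used here.

```python
# "Smart Parking" sketch: choose the available spot closest to the
# destination (shortest walk), not the spot closest to the car.
import math

def walking_distance(a, b):
    """Straight-line distance between two (x, y) points in metres."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def best_spot(destination, spots):
    """Pick the free spot that minimises walking distance to the destination."""
    free = [s for s in spots if s["free"]]
    return min(free, key=lambda s: walking_distance(s["pos"], destination))

spots = [
    {"id": "A", "pos": (0, 900), "free": True},   # far across town
    {"id": "B", "pos": (50, 40), "free": True},   # a short walk away
    {"id": "C", "pos": (10, 10), "free": False},  # closest, but occupied
]
print(best_spot((0, 0), spots)["id"])  # B
```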

In addition, Audi is developing systems for predicting traffic, helping drivers to safely change lanes, and detecting when a driver has stopped paying attention to the road.

Full story at CNET

EnterpriseAI