While not announced as part of this week’s GTC keynote, word got out via Reuters during the keynote itself that the company had suspended active testing of its DRIVE autonomous vehicle platform. As later confirmed and expounded upon by NVIDIA, the company was pausing public road testing of self-driving vehicles in light of last week’s fatal self-driving Uber collision in Arizona. Noting that they wanted to be able to learn from the incident, they opted to stop and see what the investigation turned up.

And while the pause makes sense for both political and practical reasons, since then there’s been a bit of confusion over just what NVIDIA’s involvement was with the Uber vehicle in last week’s incident, and over the timeline of their response. So as part of a GTC press Q&A session yesterday afternoon, NVIDIA CEO Jen-Hsun Huang clarified a few details about the incident – at least as much as NVIDIA is able and willing to do.

On the hardware side of matters, NVIDIA is confirming that the Uber vehicle was using NVIDIA GPUs, but that it wasn’t using the company’s DRIVE platform. This matches earlier reports that NVIDIA hardware was in the vehicle. Uber for its part has been using NVIDIA GPUs in this fashion since 2016 – including in Arizona – well before the announcement of the closer partnership with NVIDIA back at CES 2018.

As a result the vehicles in Uber’s existing fleet are using NVIDIA’s GPUs in a commercial off-the-shelf capacity paired with other processors, all running Uber’s own software stack. This is as opposed to NVIDIA’s DRIVE platform, which utilizes NVIDIA processors throughout (SoCs and GPUs), with NVIDIA’s DRIVE software package running on top of that.

As one of the early leaders in the field, it’s of course in NVIDIA’s best interests to make sure that they avoid negative controversy, especially in the case of collisions. So the distinction between GPUs and DRIVE is for them quite significant; the accident doesn’t reflect on any of their technology, be it sensor fusion, hardware fault tolerance, or the all-important neural network-based software stack that actually acts on all of this data. Consequently however, it also means they’re largely out of the loop of the investigation; since it wasn’t NVIDIA’s platform, the investigation is being undertaken by Uber itself, which is part of the reason why NVIDIA’s comments have been so reserved.

Meanwhile, as for the testing pause itself, Huang mentioned that the halt actually happened a couple of days earlier than Reuters first reported, with NVIDIA stopping “a day or two” after the initial news of the Uber collision came out. In the meantime NVIDIA is continuing to manually drive their test vehicles on public roads in order to continue collecting data and training their neural networks.

The event, while clearly unfortunate for everyone involved, nonetheless comes at an interesting time for NVIDIA. The company has previously been developing their simulator systems for training neural networks – including self-driving cars – which they are now moving forward on as a commercial product with the DRIVE Sim and Constellation systems, set to launch later this year. It’s not clear if NVIDIA is already able to use the prototypes for practical training internally – and regardless, real-world testing will resume sooner rather than later – but the Uber incident means that simulated training just became a much more valuable field, much to NVIDIA’s benefit.

Comments Locked


  • shabby - Thursday, March 29, 2018 - link

    Thanks uber, you incompetent fucks.
  • PeachNCream - Thursday, March 29, 2018 - link

    It's a shame that one incident may slow down the adoption of self-driving cars significantly when there are a lot of deaths each year attributed to distracted or unsafe driving by humans. It's absolutely terrible that someone was killed by an automated car and we need to take a look at the technology along with the entire series of events that caused the death to understand what happened and how to do better. However, I'm a lot more afraid of humans driving with their emotions and distractions than I am about even current self-driving cars. Just look at the number of people that can't control themselves at an intersection and feel like they have to do burnouts constantly without even a passing thought given to the extra danger they're putting everyone else in all around them.
  • Yojimbo - Thursday, March 29, 2018 - link

    Yeah.. people are afraid of anything new or unknown. We also live in an ultra safety conscious society. But people can't expect self-driving cars to never be involved in fatalities. Driving is an inherently dangerous thing. I doubt it will cause much of a slow-down in the adoption, though, unless lawmakers start to get in the way.
  • superunknown98 - Thursday, March 29, 2018 - link

    Whatever you think of Uber as a company notwithstanding, I have read several articles now claiming the fatal accident was the pedestrian's fault. The woman had suddenly bolted across the street with her bike and there wasn't enough time to brake. This is bound to happen again, as cars have to obey the laws of physics: they are heavy and cannot stop in an instant. If a car traveling 30MPH takes 15FT to come to a complete stop and a pedestrian jumps out from between two parked cars 10FT ahead, guess what will happen? Pedestrian right of way laws make no sense to me, I always wait for traffic to clear before crossing a street.
  • Death666Angel - Thursday, March 29, 2018 - link

    All true, it was almost assuredly the fault of the pedestrian crossing the road without checking for traffic in a dark area. And I doubt any human driver could have prevented the accident or outcome. But considering that the sensors should have better than human vision and the system should have better than human reflexes, it is still important to find out why it didn't notice the pedestrian and didn't react even when she was visible in the headlights.
    Do pedestrians have right of way in the US in all circumstances? In Germany, you have right of way when the pedestrian traffic light is green and on zebra crossings. The way the woman crossed that road, she would have been at fault for not obeying the traffic law.
  • nathanddrews - Friday, March 30, 2018 - link

    Each state in the US has slightly different traffic laws regarding right of way for pedestrians, but generally speaking if a pedestrian is in a crosswalk - in other words, not jaywalking - then they have the right of way. Someone crossing against a "do not walk/cross" signal or crossing at a random spot along the street does not have the right of way in most states. Pedestrian crashes are very rare overall and most of them do not occur at legal crossings. Alcohol and distraction are cited as very common factors in the pedestrian being hit. Our meat sack bodies don't do very well against massive moving objects. Be safe out there.
  • r3loaded - Thursday, March 29, 2018 - link

    The pedestrian was definitely at fault, but the car should have been able to spot her on lidar/radar even in the dark and taken a defensive action (i.e. slam on the brakes, return control to the safety driver). That didn't happen, and we need to know why that didn't happen.
  • sl149q - Thursday, March 29, 2018 - link

    The issue is whether Uber lived up to the expectations of what an autonomous vehicle should be able to do: first, at least matching what a human driver would have done; second, exceeding that because of the additional capabilities.

    While it appears (at this point) that a human driver would possibly still have killed or injured the pedestrian, most people would have done at least some braking.

    If the Uber car had been operating at (minimally) the equivalent of a human driver, the brakes should have been applied in the last two seconds (and apparently were not).

    If the Uber car had been operating at our expectations of an autonomous vehicle, its lidar or radar systems should have "seen" the pedestrian more than five seconds out and it should have slowed down. Again, it appears that did not happen.

    This means that while the pedestrian was at fault, it is possible that some fault may also lie with Uber in a civil suit. It is likely that Uber will endeavor to avoid a suit with a settlement.
  • K_Space - Thursday, March 29, 2018 - link

    Looking at the video, my sneaking suspicion is that the car DID in fact hand control back to the safety driver, who may have been distracted, but that's pure speculation on my part.
    I feel this to be an inherent problem with the system though: handing control from a safer system to a less safe one. What for? To avoid litigation? To play devil's advocate, I would say in cases like this the car should NOT hand control back to the driver; it causes unnecessary delay in a time-critical scenario before an action is taken.
  • jjj - Thursday, March 29, 2018 - link

    The fact that they stopped testing is very suspicious though.
    It's the wrong thing to do, unless they know that something is off with their hardware or software.
