Wednesday, March 28th 2018

NVIDIA Temporarily Halts Self-Driving Tests in Wake of Recent Crashes

(UPDATE 29MAR2018, 14H34): It has been confirmed that the UBER vehicle that suffered the crash in Arizona, mentioned in this news piece, was indeed running NVIDIA hardware. However, that hardware was apparently acquired off the shelf by UBER and didn't employ NVIDIA's DRIVE platform. The autonomous driving capabilities of the UBER vehicle were instead handled by NVIDIA hardware running UBER's own, proprietary software stack. NVIDIA is thus looking to distance its DRIVE platform from the event as much as possible, so as to avoid any unwarranted bad press for a system on which it pins such high hopes for the market.

Even though NVIDIA's CEO Jensen Huang has been delivering keynotes at this year's GTC that mainly focus on AI-driven workloads and system training, the company has decided to put a temporary stop to self-driving tests using its technology. This decision comes in the wake of the UBER crash, which killed a woman in Arizona just last week. To add more fuel to the fire, it has just been announced that the NTSB is opening an investigation into an accident involving a Tesla motor vehicle (although at the time of writing, it isn't known whether the car was in self-driving mode or not).

As a result of these events, which saw Tesla and Uber's stock valuations decline, NVIDIA has decided to halt all self-driving tests using its integrated NVIDIA DRIVE platform. Some 370 interested parties - from developers to companies - are currently exploring self-driving solutions with it, and this decision is sure to put their work on hold. However, it has to be said that, if anything, these events reinforce the notion that these systems aren't yet ready for deployment - should the fault lie completely with the automated driving mechanisms, of course. Let's not forget that human drivers make mistakes - and sometimes act exactly as they intend - that also kill human beings, and in far greater numbers.
Sources: Yahoo Finance, Reuters

23 Comments on NVIDIA Temporarily Halts Self-Driving Tests in Wake of Recent Crashes

#1
Casecutter
That system in AZ couldn't even pick out a person walking a bike...

Being driven in one of those, believing you'll intervene, is a crazy job. And that video of the driver appears to show they were looking down, totally unaware (texting...). Even if you are watching and thinking about taking control, you have to wonder: is the system going to do the right thing? If I intervene too quickly, will the designers say they should have let the system do it... as it would've? Or, while thinking about "all that" in a split second, will I still end up in some accident?

It could mess you up for life... thinking you killed someone. At least that person can rest assured, as they never had a chance to consider any of those options.
Posted on Reply
#2
AsRock
TPU addict
I hear that the accident happened due to UBER disabling vital options in the system; I guess time will tell.
Posted on Reply
#3
HD64G
Allow me to differ with the last sentence. Robots aren't humans, with the right to free choice. And thus, they shouldn't be able to handle human lives directly (surgery) or indirectly (self-driving vehicles) - only help humans in performing procedures. They can be hacked, as we all know. As for an airplane's autopilot, it operates alongside the pilot's constant presence and caution, giving him time to react in case of the autopilot's misoperation. In cars there is almost no time to react if something goes wrong with the self-driving. Food for thought.
Posted on Reply
#4
Xzibit
WSJ: Experts Break Down the Self-Driving Uber Crash
Posted on Reply
#5
R-T-B
HD64GAllow me to differ with the last sentence. Robots aren't humans, with the right to free choice. And thus, they shouldn't be able to handle human lives directly (surgery) or indirectly (self-driving vehicles) - only help humans in performing procedures.
As my father's life was essentially saved by robotic surgery, I'm not sure I can agree with that at all.

Granted, a human was driving the robot in my father's case. But the idea and concept remain the same. They have uses in the roles you specified.
Posted on Reply
#6
Space Lynx
Astronaut
Nvidia has been relying heavily on this tech to boost its stock price; seems like they are shifting hardcore to the medical world now though, hehehe. At least they are smart and move fast, even if I don't like their premium prices.
Posted on Reply
#7
intelzen
XzibitWSJ: Experts Break Down the Self-Driving Uber Crash
please watch this video before you judge "what went wrong" and "how the sky could fall in the future".
and please judge human abilities correctly - in this situation (because of the visibility and the fact that the pedestrian just walked in front of a moving car, as if it did not exist), any human would also have "killed" that pedestrian.

It is very sad that we live in an age where everyone (and therefore the mainstream media) can get triggered by an event without knowing any facts about it (and at this point, not even trying to know or analyze any facts), and because of mass triggering, progress can be stopped and more human lives can be lost in the future (don't tell me that humans do not or will not cause fatalities in traffic - on average 35,000 deaths in the US alone per year).
Posted on Reply
#8
Xzibit
intelzenplease watch this video before you judge "what went wrong" and "how the sky could fall in the future".
and please judge human abilities correctly - in this situation (because of the visibility and the fact that the pedestrian just walked in front of a moving car, as if it did not exist), any human would also have "killed" that pedestrian.
Remember that is only the perspective of the on-board camera.

0:33-0:35 would be the area where she was hit


Here is another video of the accident area 0:33-0:35


Bloomberg: Human Driver Could Have Avoided Fatal Uber Crash, Experts Say

Forensic crash analysts who reviewed the video said a human driver could have responded more quickly to the situation, potentially saving the life of the victim, 49-year-old Elaine Herzberg. Other experts said Uber’s self-driving sensors should have detected the pedestrian as she walked a bicycle across the open road at 10 p.m., despite the dark conditions.

Zachary Moore, a senior forensic engineer at Wexco International Corp. who has reconstructed vehicle accidents and other incidents for more than a decade, analyzed the video footage and concluded that a typical driver on a dry asphalt road would have perceived, reacted, and activated their brakes in time to stop about eight feet short of Herzberg.

The woman can be seen taking several steps while visible and appeared to be moving at a normal walking pace as she’s crossing the road outside of a crosswalk and does not look up at the SUV. Police have said the car didn’t slow or swerve to avoid the impact.


Herzberg becomes visible in the car’s headlights as she pushes a bicycle across the road at least two seconds before the impact. "This is similar to the average reaction time for a driver. That means that, if the video correctly reflects visible conditions, an alert driver may have at least attempted to swerve or brake," Smith said.

Moore, the forensic engineer at Wexco, said dashcam videos tend to understate what human drivers can see. While the pedestrian appears from the shadows in the video, a human driver may have had a better view if they’d been watching, he said.

Sean Alexander, of Crash Analysis & Reconstruction LLC, concurred. "Video makes everything in the light pattern brighter and everything out of the beam darker. A human eye sees it much clearer," he said.

“Seeing a few seconds of video raises more questions than answers,” said Hersman, who also served as chairman of the NTSB.
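
As a rough illustration of the forensic reasoning quoted above, here is a minimal back-of-envelope stopping-distance sketch in Python. The ~1.5 s perception-reaction time, ~7 m/s² dry-asphalt deceleration, and 40 mph travel speed are assumed values for illustration only, not figures taken from the investigation or from Moore's analysis:

# Back-of-envelope stopping distance: distance covered while reacting plus braking distance.
# All parameter values below are illustrative assumptions, not investigation data.

def stopping_distance_m(speed_mps: float,
                        reaction_time_s: float = 1.5,
                        deceleration_mps2: float = 7.0) -> float:
    """Total distance needed to come to a full stop from a given speed."""
    reaction_distance = speed_mps * reaction_time_s               # constant speed while the driver reacts
    braking_distance = speed_mps ** 2 / (2 * deceleration_mps2)   # v^2 / (2a) under constant deceleration
    return reaction_distance + braking_distance

if __name__ == "__main__":
    speed_mph = 40.0                    # assumed travel speed
    speed_mps = speed_mph * 0.44704     # mph -> m/s
    print(f"~{stopping_distance_m(speed_mps):.0f} m needed to stop from {speed_mph:.0f} mph")

For comparison, the roughly two seconds of visibility mentioned above corresponds to about 36 m of travel at that assumed 40 mph, which is why the experts focus on whether the pedestrian was actually visible earlier than the dashcam footage suggests.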
Posted on Reply
#9
jabbadap
XzibitWSJ: Experts Break Down the Self-Driving Uber Crash
Pedestrians should really use safety reflectors.

I shouldn't joke around when someone actually died, but this reminds me of Volvo's automatic brake test failure in the past (yeah, I know this has nothing to do with the car maker this time):
Posted on Reply
#10
Casecutter
intelzenplease watch this video before you judge "what went wrong" and "how the sky could fall in the future".
and please judge human abilities correctly - in this situation (because of the visibility and the fact that the pedestrian just walked in front of a moving car, as if it did not exist), any human would also have "killed" that pedestrian.

It is very sad that we live in an age where everyone (and therefore the mainstream media) can get triggered by an event without knowing any facts about it (and at this point, not even trying to know or analyze any facts), and because of mass triggering, progress can be stopped and more human lives can be lost in the future (don't tell me that humans do not or will not cause fatalities in traffic - on average 35,000 deaths in the US alone per year).
Ah, did you see the inside dash cam view of the person sitting behind the wheel... it is super clear.
www.abc15.com/news/region-southeast-valley/tempe/watch-tempe-police-release-video-of-deadly-uber-crash

As to the "system" it for whatever the reason never saw or judged that obstacle, and there's no reason all those sensors and such should've ever missed... yes that stupid person. I'm not saying a driver clearly focused on the task of driving would've not hit that person walking a bike, but at least "I" in all probability been able to brake and steer away, that "system" did no evasive maneuver.

The technology/system has a flaw or broke down. The technology has been sold to everyone on the premise that the odds of an instance like this are next to non-existent - like an individual being hit by a meteorite.
I would say they're not.
Posted on Reply
#11
HD64G
R-T-BAs my father's life was essentially saved by robotic surgery, I'm not sure I can agree with that at all.

Granted, a human was driving the robot in my father's case. But the idea and concept remain the same. They have uses in the roles you specified.
OK then. Let me rephrase my opinion on the part about surgery: "I agree with the use of robots for health reasons (surgery, for instance) only when the patient's life is in great danger and the doctor is too far away to operate in time to save him without the robotic arm's help. And I disagree with using robots in surgery when no one's life is in great danger".
Posted on Reply
#12
R-T-B
HD64GOK then. Let me rephrase my opinion on the part about surgery: "I agree with the use of robots for health reasons (surgery, for instance) only when the patient's life is in great danger and the doctor is too far away to operate in time to save him without the robotic arm's help. And I disagree with using robots in surgery when no one's life is in great danger".
I'm not sure I agree with that either, but I have a feeling we're just too far apart on this.
Posted on Reply
#13
csgabe
They should redirect their systems to mine anything.
Posted on Reply
#14
Xzibit
CasecutterAh, did you see the inside dash cam view of the person sitting behind the wheel... it is super clear.
www.abc15.com/news/region-southeast-valley/tempe/watch-tempe-police-release-video-of-deadly-uber-crash
Certain people are taking the dashcam to be a representation of actual conditions.



The driver's cabin cam shows greater detail of the street if you look out the driver-side window. You can see the bushes in detail, not just in this area but throughout the entire interior video.

Ars Technica: Police chief said Uber victim “came from the shadows”—don’t believe it
Posted on Reply
#15
Markosz
That's just stupid... Why would one accident make them stop?
More than a thousand people die per day in traffic accidents caused by HUMAN ERROR; by that logic, no one should ever drive?
Posted on Reply
#16
Fluffmeister
MarkoszThat's just stupid... Why would one accident make them stop?
More than a thousand people die per day in traffic accidents caused by HUMAN ERROR; by that logic, no one should ever drive?
For sure, human stupidity will have AI baffled for years to come.

But it's only sensible for all parties involved in the development of such technology to take a step back... including Nvidia, despite the fact their DRIVE technology wasn't used.

Let's just be thankful she wasn't Sarah Connor.
Posted on Reply
#17
Arjai
This is sad. This person died. Why? Analytics aside, they died because of the driver's inattention to the road ahead. If you are sitting behind the steering wheel of a car, there is no excuse for killing a pedestrian. Use up all your silly scenarios and what-ifs.

We are ALL pedestrians!! When you are in a car, driving, it is a responsibility, RESPONSIBILITY, to not kill Pedestrians, or other Motorists, or any Bicyclists or Bikers!!

I don't think self-driving cars should be allowed to drive unattended in ANY METRO AREA! Road tripping through Kansas? Hand me another Beer! LOL!

Letting a computer drive you in the city means, to me, you have no consideration for anyone else, at all. I have seen pedestrians do some stupid shit (cross the street in front of a train, anyone?), but I have seen WAY more automotive drivers do stupid shit!! I think I would trust a drunk to walk to the store and back. But not some of the people I have seen driving!

So, be sober, behind the wheel. Be courteous, be attentive and don't weaponize your vehicle!

:lovetpu:
Posted on Reply
#18
HTC
jabbadapPedestrians should really use safety reflectors.

I shouldn't joke around when someone actually died, but this reminds me of Volvo's automatic brake test failure in the past (yeah, I know this has nothing to do with the car maker this time):
@ the very least, they should use clothes with lighter colors.

I use a bike to go to work every day, regardless of weather conditions, and I always wear a piece of reflective clothing. Yesterday I wore two because it was raining, so I had to also use the lower part of my reflective gear: I want to be clearly seen when going to / from work.

That said, I've seen the video posted by @Xzibit and I'm not entirely convinced a human driver could have avoided that pedestrian, because you only see her when you're almost hitting her: perhaps a swerve could have avoided hitting her full on but, judging by the onboard camera footage, there's very little time between when she's seen and when she's hit.

According to those who speak in this video, Lidar should have picked up the presence of the pedestrian, regardless of lighting conditions, long before a human driver could, so I'm guessing something definitely didn't work as it should have.
Posted on Reply
#19
medi01
Perhaps relevant:
2 + 2 = 4, er, 4.1, no, 4.3... Nvidia's Titan V GPUs spit out 'wrong answers' in scientific simulations
XzibitCertain people are taking the dashcam to be a representation of actual conditions.
Frankly, I didn't even need to see other pictures to call bullshit.
Victim crossed 2 lanes, slowly.
It doesn't matter if the road was illuminated or not; the car's lights are more than adequate to see the pedestrian from a distance far enough to stop.
MarkoszThat's just stupid... Why would one accident make them stop?
Because their system failed miserably in a very ordinary situation.
MarkoszMore than a thousand people die per day in traffic accidents caused by HUMAN ERROR; by that logic, no one should ever drive?
That's caused by hundreds of millions of people driving.
If we let Uber's piece of shit loose in those numbers, we'd likely see even more victims.
Posted on Reply
#20
Casecutter
Two things "shadows or condition" have nothing to do why the sensors and system, it just never intervened (it failed). Those sensor are designed to look through, and not have our human "shortcoming". I don't advocate ceasing the systems development, but just like the FFA until you know what went wrong they have to be prudent and due diligence.

Also, as I said in my first post, having humans sitting behind the wheel waiting for the unexpected, and then losing milliseconds while thinking, "is the car going to do what it's supposed to, or should I do something... or will the engineers who have me sitting here reprimand me for doing something... or for not doing anything?"

My son had his first accident and he's shaken... it was bad. He insists he was not distracted (and he's not like that, but sure, parents can never be positive); luckily no one was severely hurt. Still, one unfocused second leads to a bad deal, so yes, humans are very fallible. He's hard on himself now; he'll have to get over it and get back behind the wheel.

What I started with here... was tabling the idea that putting humans in to "over-watch" these cars, like that lady was doing... is one hell of a job! Let's face it, none of the folks in that job (at Uber) are top-notch engineering types just riding around at night.
Posted on Reply
#21
Xzibit
PA: NTSB Investigating Tesla Crash Suspected of being on Autopilot Killing Walter Huang, a new Apple Engineer

Apple engineer Walter Huang died on Friday after the new Tesla Model X he was driving crashed into a barrier in Mountain View, California.
"Before the crash, Walter complained 7-10 times the car would swivel toward that same exact barrier during autopilot. Walter took it into dealership addressing the issue, but they couldn't duplicate it there."
Posted on Reply
#22
Hood
XzibitPA: NTSB Investigating Tesla Crash Suspected of being on Autopilot Killing Walter Huang, a new Apple Engineer
This guy was definitely at fault, especially since he knew about the mysterious glitch in the system that caused the autopilot to veer towards solid concrete barriers. As we used to say in the auto repair business, "There's a screw loose in the nut behind the wheel". Why would he trust the wonky autopilot with his life? He was a new Apple engineer, so he may have been on the team trying to finish their upcoming self-driving car, and was hacking or reverse engineering the Tesla's system, and introduced the problem himself (wouldn't that be an ironic twist?). I hope to see more about this; something is missing from this story. This is a major setback for the technology, which has apparently sent Nvidia back to the drawing board to fix their PX2 system. Maybe this is why Tesla is developing their own in-house A.I. driving system and dropping Nvidia's. The robots have begun the killing...
Posted on Reply
#23
Xzibit
Electrek: Tesla owner almost crashes on video trying to recreate fatal Autopilot accident
ElectrekThen it seems like Autopilot’s Autosteer stayed locked on the left line even though it became the right line of the ramp. The system most likely got confused because that line was more clearly marked than the actual left line of the lane.
That led the car directly into the barrier, and it’s easy to see how a driver who is not paying attention wouldn’t have been able to react in time, since the driver who recreated it was barely able to apply the brake in time himself.
Tesla Auto Pilot tries to kill occupants!!!
Posted on Reply