
Self-Driving Uber Car Kills Arizona Pedestrian


Punching Bag


21 minutes ago, Mr Info said:

 

Please explain. Though long-distance autonomous car travel may not be at the forefront of early deployment design, there are plenty of articles that specifically call out long-distance autonomous car trips and the impact it may have on airline travel.

You still have to "man" it... Stay focused on safety.  No AI in the world will accomplish that unless we are 100% autonomously compliant.  And human pedestrians on the road will never be! ;)

 

It's 100% ALL in, or give up on NOT having to pay attention.

Edited by ExiledInIllinois

1 minute ago, ExiledInIllinois said:

You still have to "man" it... Stay focused on safety.  No AI in the world will accomplish that unless we are 100% autonomously compliant.  And human pedestrians on the road will never be! ;)

 

It's 100% ALL in, or give up on NOT having to pay attention.

 

Pedestrians need to pay attention as well.  Like using appropriate crossing areas and minding traffic when crossing, whether in an appropriate crossing area or, especially, when not.


 

That car just kept chugging along.  It didn't brake, didn't flash its high beams, maybe (though it doesn't seem likely) blew its horn, and didn't veer from its course by a millimeter.  Yes, the person shouldn't have been there, but a driver can't just plow through a random human (or any other object) in the roadway without doing ANYTHING.  And if you still need a human to monitor the driverless system, then just let the human drive.  Having someone wait, wait, wait until it is too late to take evasive measures = a delay that makes an accident more likely, not less.

Edited by snafu

1 hour ago, BeginnersMind said:

 

Good god why?

 

Speed it up. Some people will die. But computers will be better at driving than people. Way better.

Honestly, because I love to drive. Sure, driving in rush-hour traffic sucks, but my 30-minute drive to work in the morning is my time. My free time away from the kids, wife, etc.


1 hour ago, ExiledInIllinois said:

You still have to "man" it... Stay focused on safety.  No AI in the world will accomplish that unless we are 100% autonomously compliant.  And human pedestrians on the road will never be! ;)

 

It's 100% ALL in, or give up on NOT having to pay attention.

 

27 minutes ago, snafu said:

 

That car just kept chugging along.  It didn't brake, didn't flash its high beams, maybe (though it doesn't seem likely) blew its horn, and didn't veer from its course by a millimeter.  Yes, the person shouldn't have been there, but a driver can't just plow through a random human (or any other object) in the roadway without doing ANYTHING.  And if you still need a human to monitor the driverless system, then just let the human drive.  Having someone wait, wait, wait until it is too late to take evasive measures = a delay that makes an accident more likely, not less.

 

If you have to "man" it, then it is not autonomous. Someday, perhaps when you are older and your vision and reaction time are not as capable as they are now, you may want to go somewhere. Your ability to 'man' the vehicle would increase the odds of an accident if you decided to take over. The focus for autonomous vehicles is on safety. People make mistakes... this is being implemented to avoid the distraction, slow response time, etc., that people incur and that lead to accidents. The autonomous system will not be able to read my mind if I come flying out of the dark into an intersection at 20 mph on my bike or run onto a freeway. But that's my bad if I get hit by a car, whether it has a driver or is driverless. Of course, avoiding collisions should be the ultimate goal regardless of who or what is driving, but sometimes people on or near roads are not cautious, do not think through their decisions, and something untoward happens to them. Certainly, the fault should not fall upon the 'driver' in these instances.

 

15 minutes ago, Steptide said:

Honestly, because I love to drive. Sure, driving in rush-hour traffic sucks, but my 30-minute drive to work in the morning is my time. My free time away from the kids, wife, etc.

Anyone who likes to drive should be able to continue to do so even when autonomous cars become commonplace. I expect that when autonomous vehicles become the norm, insurance will become cost-prohibitive for most people. Unfortunately, people will make far more driving errors than autonomous vehicles, and this will be reflected in their insurance rates.


2 hours ago, Doc said:

 

Pedestrians need to pay attention as well.  Like using appropriate crossing areas and minding traffic when crossing, whether in an appropriate crossing area or, especially, when not.

But they do make irrational decisions. Should they die for their mistake when a human on the other end could prevent that death, or at least lessen the harm?

 

In this case, she was toast no matter what.  She would have been a goner with a human in full control.  IMO, from viewing the exterior dash-cam shot.

3 minutes ago, Mr Info said:

 

 

If you have to "man" it, then it is not autonomous. Someday, perhaps when you are older and your vision and reaction time are not as capable as they are now, you may want to go somewhere. Your ability to 'man' the vehicle would increase the odds of an accident if you decided to take over. The focus for autonomous vehicles is on safety. People make mistakes... this is being implemented to avoid the distraction, slow response time, etc., that people incur and that lead to accidents. The autonomous system will not be able to read my mind if I come flying out of the dark into an intersection at 20 mph on my bike or run onto a freeway. But that's my bad if I get hit by a car, whether it has a driver or is driverless. Of course, avoiding collisions should be the ultimate goal regardless of who or what is driving, but sometimes people on or near roads are not cautious, do not think through their decisions, and something untoward happens to them. Certainly, the fault should not fall upon the 'driver' in these instances.

 

Anyone who likes to drive should be able to continue to do so even when autonomous cars become commonplace. I expect that when autonomous vehicles become the norm, insurance will become cost-prohibitive for most people. Unfortunately, people will make far more driving errors than autonomous vehicles, and this will be reflected in their insurance rates.

Fair reasoning.

 

Question: in this situation, given the facts, should that driver's rates go up?

Edited by ExiledInIllinois

23 minutes ago, Mr Info said:

 

 

If you have to "man" it, then it is not autonomous. Someday, perhaps when you are older and your vision and reaction time are not as capable as they are now, you may want to go somewhere. Your ability to 'man' the vehicle would increase the odds of an accident if you decided to take over. The focus for autonomous vehicles is on safety. People make mistakes... this is being implemented to avoid the distraction, slow response time, etc., that people incur and that lead to accidents. The autonomous system will not be able to read my mind if I come flying out of the dark into an intersection at 20 mph on my bike or run onto a freeway. But that's my bad if I get hit by a car, whether it has a driver or is driverless. Of course, avoiding collisions should be the ultimate goal regardless of who or what is driving, but sometimes people on or near roads are not cautious, do not think through their decisions, and something untoward happens to them. Certainly, the fault should not fall upon the 'driver' in these instances.

 

Anyone who likes to drive should be able to continue to do so even when autonomous cars become commonplace. I expect that when autonomous vehicles become the norm, insurance will become cost-prohibitive for most people. Unfortunately, people will make far more driving errors than autonomous vehicles, and this will be reflected in their insurance rates.

Dude, haven't you seen I, Robot?


22 minutes ago, ExiledInIllinois said:

But they do make irrational decisions. Should they die for their mistake when a human on the other end could prevent that death, or at least lessen the harm?

 

In this case, she was toast no matter what.  She would have been a goner with a human in full control.  IMO, from viewing the exterior dash-cam shot.

Fair reasoning.

 

I'm only talking about this instance.  No one could have missed hitting her.


30 minutes ago, Doc said:

 

I'm only talking about this instance.  No one could have missed hitting her.

Yep!

 

I wonder what the data shows, and whether the computer reacted prior to impact.  In a strange way, maybe it did.  That would make it better than a human's reaction?  Yep, she bought the farm, but she may have received a slightly gentler blow.  The interior dash cam showed the driver reacting to the vehicle's deceleration.  Is that why he looked up?


1 hour ago, ExiledInIllinois said:

 

Question: in this situation, given the facts, should that driver's rates go up?

 

I will give it a shot, but this could change if more facts come forward. Information has been revealed that two things occurred prior to/at the accident: the autonomous car applied its brakes, but not quickly enough to avoid the collision, and the driver was looking down and not ahead. It would be interesting to know whether the driver would have braked and/or grabbed the wheel more quickly than the autonomous system reacted to the scenario. I believe the result would have been the same even if the driver had been looking forward. The fault would have been on the person crossing, not on whoever the driver was.
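
Since the question turns on reaction time, here is a rough back-of-the-envelope sketch of how much that delay matters for stopping distance. It is only an illustration: the speed, reaction times, and deceleration below are assumed numbers, not figures from the crash report.

def stopping_distance_m(speed_mph, reaction_s, decel_ms2):
    """Distance covered during the reaction delay plus the braking distance."""
    speed_ms = speed_mph * 0.44704              # mph -> m/s
    reaction_distance = speed_ms * reaction_s   # travelled before brakes apply
    braking_distance = speed_ms ** 2 / (2 * decel_ms2)
    return reaction_distance + braking_distance

# Assumed numbers for illustration only (not the actual crash data):
# ~40 mph, hard braking at ~7 m/s^2 on dry pavement.
speed = 40.0
decel = 7.0
for label, reaction in [("attentive human, ~1.5 s", 1.5),
                        ("distracted human, ~2.5 s", 2.5),
                        ("automated system, ~0.5 s", 0.5)]:
    d = stopping_distance_m(speed, reaction, decel)
    print(f"{label}: ~{d:.0f} m to stop from {speed:.0f} mph")

With those assumed numbers, the reaction delay alone accounts for tens of meters of travel before braking even begins, which is why the "would an attentive driver have reacted sooner" question matters so much.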

 

But your question is interesting. If one owns an autonomous car and it gets into an accident and is determined to be 'at fault' due to a software glitch or some other unknown reason, does the vendor become responsible for any resulting expenses?

 

This question is already being addressed (from this link: https://venturebeat.com/2018/01/26/3-ways-self-driving-cars-will-affect-the-insurance-industry/ ). Google, Volvo, and Mercedes-Benz already accept liability in cases where a vehicle’s self-driving system is at fault for a crash. Tesla is taking things a step further by extending an insurance program to purchasers of Tesla vehicles. This offering shows an extraordinary level of confidence in the technology. Current data even indicates that preventable human error is the cause of as many as 94 percent of all accidents.

 

 


On 3/24/2018 at 7:14 PM, Limeaid said:

 

As long as you are in training for the cars' targeting systems, sure.

Irrelevant. People think that because they are in control, they will be better drivers. Such an assumption is baseless now, and in very short order it will be not only wildly wrong but dangerously so.

 

Driving yourself will be a dangerous anachronism in a decade and will have gone the way of the dodo in two decades. My children's children will not drive cars.


On 3/22/2018 at 9:05 PM, Mr Info said:

Current data even indicates that preventable human error is the cause of as many as 94 percent of all accidents.

 

 

 

I am surprised it's that low, frankly.  Except in cases of major mechanical defect (e.g., Firestone tires), almost every accident can be traced back to somebody !@#$ing up.  The last time my brakes failed, for example, it wasn't the brakes that were the problem; it was the idiot mechanic who only hand-tightened the reservoir drain plug.


7 minutes ago, DC Tom said:

 

I am surprised it's that low, frankly.  Except in cases of major mechanical defect (e.g., Firestone tires), almost every accident can be traced back to somebody !@#$ing up.  The last time my brakes failed, for example, it wasn't the brakes that were the problem; it was the idiot mechanic who only hand-tightened the reservoir drain plug.

 

Are brake failures a common occurrence for you? :)

 

I had it happen once, and it was a brake line failure.  Maybe I lead a charmed life: I had the Firestone tires, but they didn't fail on me.

 

Anyway, the 94% preventable stat is interesting. I wonder how it counts weather-related accidents, where the human error may have been deciding to be out driving in the first place.

 

