eveln

Who is guilty if a driverless car runs over and kills a person ?


So your automated car is travelling along carrying you and your family ( partner and two offspring ). All is fine till a pedestrian chooses to cross the road - the car is programmed to save the most humans possible - hence the auto function diverts the car and one pedestrian dies in order to save the four people inside.

Now in this day and age, with humans sitting in the driving seat and doing the driving, pedestrians are never classed as being in the wrong ( legally speaking ), I think. So if you hit a person whilst driving your vehicle, you'd be in deep shit legally. How, or who, do you charge in regard to the automated ( driverless ) car ? Not forgetting here that the car's programmed reflexes are far superior to a human's, so even if the car was programmed to allow the human to override the auto, you'd have to figure the human reflex to be slower, yes ?... and would probably make much more of a mess of the collision.

What say the driverless car is programmed to save all life within the vehicle ( presupposing those in the car are family of the owner ), regardless of whether an accident would hurt, perhaps kill, more people than those inside the vehicle ?
I mean, one of the sales pitches for the driverless car is that it will lessen the road toll ...

All of the above is assuming the car has behaved according to road rules at all times and done its automated best to follow its programming ... I guess that means if the pedestrian survives the collision they then go up on charges for infringement of the road rules.


 



Q: Should a driverless car sacrifice the lone occupant if it meant saving four pedestrians?

A: Absolutely

 

Q: Would you buy a car that was programmed to kill you to save the lives of four pedestrians who didn't look before crossing the road?

A: Fuck no



That's part of the reason they'll never become mainstream.  The legalities, and the fact that nobody has ever programmed an infallible complex system - consumer OSes, aircraft and spacecraft control systems, industry automation, building security systems, to name just a few.

And the hype - it's the same bullshit crowd that have been saying cold fusion is 20 years away since the mid 1970s.



driverless cars don't kill people!  programmers kill people, with driverless cars!

the ethical dilemmas are a gigantic fly in the ointment.  that's why, as much as i am fascinated with all the developments, i've never been bullish about them.  soooo much naive underestimation of the complexities, even amongst the smartest people.

there have already been a number of incidents.  and a lot more people gotta die before they get this stuff figured out.  it may take these cars becoming mainstream to do that, which of course is a chicken and egg thing.  looking at it with cold pragmatism, i suppose the ultimate death toll will be minuscule compared to the lives it took to refine the safety of traditional cars — but that is no consolation if it's you and yours.  it's so strange that being bulldozed by a rogue computer weighing a tonne has become a form of natural selection.

we seem to be proceeding in willful ignorance of the fact we could barely expect even an on-board Artificial General Intelligence to grapple with the thorniest in-the-moment moral decisions.  and in the meantime, through either acts or omissions, we end up with lives saved or sacrificed by the contentious bias of programmers, and/or random number generators.



@SpudMuffin ... I guess you'll be doing a lot of walking and public transport then ? ... but what if the accident didn't kill those in the car or the pedestrians ? There'd still be hospital bills for those in the vehicle though ... and resultant mental trauma ...

@Rybags ... " That's part of the reason they'll never become mainstream.  The legalities, and the fact that nobody has ever programmed an infallible complex system. "...

Well the above didn't stop us putting the first motors on the road and allowing those that could afford them to drive them without a licence, did it ? Yeah I know we've come some distance since the horse and buggy, but ...

@~thehung  ...( couldn't highlight you, curious that ). ..." driverless cars don't kill people!  programmers kill people, with driverless cars!  "...

guns don't kill people! makers of guns kill people, with guns! ... looks like we're gonna have automated cars en masse.

Maybe the sales pitch for the various motors will be the style of program. I bet the best seller is the one with the program that is set to always save the person/s in the vehicle first. Which is going to play havoc with all of our lores and laws currently espoused.
Until of course someone thinks that maybe that's not too great ethically ... and then what happens ? Will people buy a machine that might kill them on their way to work ?

@TheManFromPOST ... I think the sales pitch is that the automated car will stem the road toll ? ... 'course this does not allow for the possibility of what a hacker might do ... or even someone doing the programming while distracted by their loved one's death due to an automated car, or < shrugs > just not having had enough sleep ...



Jokes aside, the occupant is at fault.  It's *your* car - and *your* responsibility for what it does.  Also, the law as currently written says you are supposed to be the one in control, and the autopilot is assistance.


The other possibility is that we'll experience the other end of the scale.

Roads full of paranoid androids with traffic jams that last for hours and sick occupants of vehicles that start/stop constantly and don't maintain a steady speed for more than a few seconds at a time.



Yeah @Cybes, I reckon the laws will have to change eventually though, and I'm hoping I won't be impacted by them myself. But I don't think they will change till the vehicle has no human override function.

Roads full of traffic jams due to program malfunctions caused by individual company competitiveness ... you know, like random interference or sumfink


Why must there be guilt? Why must there be blame apportioned?

The lawyers want you to focus on that so they can apportion payment of costs, rather than society accepting a tragedy and working to avoid it. Call it an act of god, or, for those of us who are atheists, perhaps we can feel more at ease with a 'statistical anomaly'.

Risk management - understanding, assessing, planning and mitigating reduce the chances.

There are a bunch of technology and ethics people who believe that self-driving cars will legally require a certification framework. If it's certified, no-one is to blame, and insurance companies have to accept that. Personally, I don't think they have thought through the commercial realities of that. People won't pay for the engineering rigor required for every car to be certified to those standards.

On average, I trust computers to drive cars much more than people. Under correct operating circumstances* they never get tired, distracted, drunk/drugged etc, and they have a good reaction time. They will still come across unpredictable configurations and potentially fail to act correctly, resulting in a fatality, but I believe the road toll would be at least an order of magnitude lower, and the general productivity of society improved, so it's a net gain.

*'correct operating circumstances' is the issue: software/systems that are hacked/cracked, willingly modded, riddled with malware etc. Let's face it, we can barely keep the internet from going to hell in a handbasket when it comes to cyber security. I don't trust the car industry to build a secure computing stack. And to be viable they would be locking the car computers and systems down to a level that would make Apple hardware and iOS look like open-source nirvana. Build, supply and maintenance of cars would end up similar to aircraft systems. Expect very slow changes, stifled innovation and excessive costs for all variants, plus maintenance costs from licensed service personnel for regular alignment/calibration and safety checks - let's face it, we can't even use an extension lead these days without an electrician tagging it every year. Imagine the requirements for the radars and camera systems fitted to a car.

It will only work when personal/individual transport is a service - you don't own the car; a company manages a fleet of self-driving Ubers or equivalent, and every car is guaranteed to be tested, calibrated and recertified. Then the insurance companies and lawyers will find a way to make it work. Under that self-driving utopia, motoring enthusiasts who want to drive their car will be limited to tracks.

And then it all collapses when you realise that people still need to drive cars in country areas, off road, etc, and the model falls over. You can't practically implement both configurations with 80% automated, 20% human driven and expect to get 80% of the gains. And the model would depend on achieving all of the gains to justify almost any of the effort.

It needs the death of motoring as a hobby. That's 2-3 generations from now, of riding in Ubers with transport considered a service. Eliminating the joy of driving and replacing it with a utilitarian requirement, at which point you might be able to get to 99% automated driving and achieve the goals. More likely, in that time (100 years) you would eliminate personal transport through larger societal changes. Let's face it, we have only had cars being ubiquitous for 3-4 generations (<100 years).

(wow that ended up being a long stream of consciousness)


1 hour ago, stadl said:

Why must there be guilt? Why must there be blame apportioned?

There's been an incident between a vehicle and pedestrians ... people are dead/hurt ... we need to apportion blame to maintain the human as important in the grand scheme of things. If we don't, then the "tragedy" risks becoming less than a blip on society's radar of need to care

1 hour ago, stadl said:

The lawyers have want you to focus on that so they can apportion payment of costs, ...

Well yes, business and money ( however it looks ) make the world go round.

Even though I risk death every time I drive my car, I'm awfully glad of the long-range forecast of the automated car being in my driveway

3 hours ago, Cybes said:

Jokes aside, the occupant is at fault.  It's *your* car - and *your* responsibility for what it does.  Also, the law as currently written says you are supposed to be the one in control, and the autopilot is assistance.

But the cars they are envisioning don't have manual controls.  They're talking about using them as taxi services etc so you don't even own the car you are in. 

I think that as long as the car has been serviced and hasn't been modified in any way, it has to go back to the company who makes the car and/or supplies the software.

2 hours ago, stadl said:

I believe the road toll would be at least an order of magnitude lower

That's a very safe bet, imho - even the incomplete v1.0b versions available atm are vastly safer than human drivers.  But there will always be idiots who think "not perfect" means "unacceptable", even when it's *way* superior to the current situation.*

(*eg: At one point, the US Army were looking for a replacement for the venerable 1911 semiauto.  After 6 months of trials, *none* of the entrants had passed all of the tests - because at least one of the tests was wholly unreasonable.  So they stuck with their 1911s.  This despite the fact that *all* of the other entrants had outperformed the 1911 in *every*single*test*.  They could literally have chosen the worst possible replacement option and still have improved their standard sidearm!)

 

55 minutes ago, fliptopia said:

But the cars they are envisioning don't have manual controls.  They're talking about using them as taxi services etc so you don't even own the car you are in. 

I think that as long as the car has been serviced and hasn't been modified in any way, it has to go back to the company who makes the car and/or supplies the software.

OP specified "your automated car".  That's not even an implication - it's an outright statement.  Anything owned passes responsibility to its owner - whether the owner can then shunt that on to their software or hardware provider is a separate issue.


i reckon the solution is to make an app that lets you drive the car, so people can stare at their mobile while driving, because that appears to be becoming commonplace

7 hours ago, scruffy1 said:

i reckon the solution is to make an app that lets you drive the car, so people can stare at their mobile while driving, because that appears to be becoming commonplace

edit: i should add: actual collision is NOT shown.



those ^^ people should not be allowed to leave their home. Neither the one in the vehicle nor the one crossing the road with the pushbike.


I guess also there is a difference between 'guilty' and 'responsible' for these kinds of situations. If you're the sole occupant of an automated car and someone else dies, you may not be 'guilty' as you were not in control, but were you 'responsible'?

 

Who would be 'liable', the occupant (if deemed to be otherwise 'at fault') or the developers of the system through their action, inaction or ignorance of the failing?

7 hours ago, SpudMuffin said:

 

Who would be 'liable', the occupant (if deemed to be otherwise 'at fault') or the developers of the system through their action, inaction or ignorance of the failing?

I'm of a mind with Cybes here: the owner / person in the "driver's" seat of the vehicle is legally responsible. They could try to sue the developers later if they, the owner, feel they bought a faulty program ...

... but if the car has no human override function, I'd say the developers would be legally to blame


This is a beloved ethical question, but it's actually not a realistic one.

Who is responsible if a driverless car runs over and kills a person? Definitely the company that made it will be found liable, for one specific reason: outdriving your braking distance is a strictly human problem.

Self-driving cars _will not_ drive faster than their confidence speed. The confidence speed will be the speed at which the car is certain that no pedestrian it can currently see can manage to jump in front of it before it can stop, and no pedestrian who might be hiding in a shrub out of LIDAR view could jump in front of the car.

Critically, that means that when in CBDs, down streets with lots of hedges, going around blind corners and in any other situation where visibility is compromised, self-driving cars are just not going to exceed 15 km/h, which is around the speed that a car with a reaction time of almost zero can confidently drop below injury speed from at short notice.
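To put a very rough number on that 15 km/h figure, here's a back-of-envelope sketch. Every value in it (braking deceleration, reaction delay, sight distances) is my own assumption for illustration, not anything from a real self-driving stack: with a near-zero reaction time the stopping distance is roughly v²/2a, and the "confidence speed" is whatever speed keeps that distance inside the clear sight line to the nearest spot a pedestrian could emerge from.

```python
# Back-of-envelope sketch (illustrative only): how fast can a car with
# near-instant reactions go and still stop within a given clear sight line?
# All numbers below are assumptions, not figures from any real vehicle.
import math

def confidence_speed_kmh(sight_distance_m, decel_mps2=7.0, reaction_s=0.1):
    """Max speed (km/h) at which the car can stop within sight_distance_m.
    Solves v*t + v^2/(2a) = d for v (reaction travel plus braking distance)."""
    a, t, d = decel_mps2, reaction_s, sight_distance_m
    return 3.6 * a * (-t + math.sqrt(t * t + 2 * d / a))

for d in (2, 5, 10, 30):  # metres of clear sight line to the nearest hiding spot
    print(f"{d:>2} m visible -> ~{confidence_speed_kmh(d):.0f} km/h")
```

With only a couple of metres of clear sight line that comes out around 15 km/h, which squares with the figure above; a real planner would also have to budget for pedestrian sprint speed, road surface and sensor noise, so treat the numbers as indicative only.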

That sounds like it'll make self-driving cars uncompetitive with human-controlled cars, but there are a few mitigating factors.

1. Insurance on human cars will go through the god damn roof
2. Self-driving cars will eventually (and it'll take years from when the first self-driving cars hit the road) start sharing networked sensor information to help mitigate the problems with blind corners
3. Self-driving cars will /really quickly/ start sharing map data about areas with shit visibility or lots of pedestrians and avoid them where possible
4. You won't care if your commute through the CBD takes 15 minutes longer if you can be eating breakfast and reading the news while it happens
5. Self-driving cars can all accelerate as one block when the traffic lights go green, and you get about four times as much traffic (IIRC) through each light change if you can do that, which means what self-driving cars lose while driving near pedestrian zones they make up in spades during light changes (see the toy model sketched below)
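On point 5, the "four times" figure is easy to sanity-check with a toy model. Every number in it (green phase length, acceleration, per-driver start-up lag, car spacing, speed limit) is an assumption I've picked for illustration, not a traffic-engineering result:

```python
# Toy model (assumed numbers only): cars queued at a red light. Human drivers
# each add a start-up lag before they begin moving; networked self-driving
# cars all launch together. How many clear the stop line per green phase?
GREEN_S = 20.0    # seconds of green
ACCEL = 2.5       # m/s^2, assumed acceleration from rest
TOP = 16.7        # m/s (~60 km/h), assumed speed limit
GAP = 7.0         # m, assumed car length plus standing gap
HUMAN_LAG = 1.2   # s, assumed per-driver delay before starting to move

def time_to_cover(d):
    """Seconds to travel d metres from rest: accelerate, then cruise at TOP."""
    d_acc = TOP ** 2 / (2 * ACCEL)          # distance used reaching top speed
    if d <= d_acc:
        return (2 * d / ACCEL) ** 0.5
    return TOP / ACCEL + (d - d_acc) / TOP

def cars_cleared(per_car_lag):
    n = 0
    while per_car_lag * n + time_to_cover(GAP * (n + 1)) <= GREEN_S:
        n += 1
    return n

print("human drivers :", cars_cleared(HUMAN_LAG))  # ~11 with these numbers
print("networked cars:", cars_cleared(0.0))        # ~39 with these numbers
```

With those made-up inputs the platoon clears roughly three to four times as many cars per cycle, which is at least in the same ballpark as the figure quoted above.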

But yeah, this notion that self-driving cars might end up in a situation where they can't stop in time and have to choose someone to kill? Humans suffer from impatience, robots do not. Robots just won't ever drive fast enough to allow that situation to be possible.

1 hour ago, Sir_Substance said:

this notion that self-driving cars might end up in a situation where they can't stop in time and have to choose someone to kill?

Surprise happens.  If the road surface isn't what you thought it was, or some drongo sticks his head out of a manhole you didn't know was there.  Yes, edge cases, but edge cases are the topic at hand - you don't need to decide who has to die as a matter of course.

5 hours ago, Sir_Substance said:

This is a beloved ethical question, but it's actually not a realistic one.

Who is responsible if a driverless car runs over and kills a person? Definitely the company that made it will be found liable, for one specific reason: outdriving your braking distance is a strictly human problem.

Self-driving cars _will not_ drive faster than their confidence speed. The confidence speed will be the speed at which the car is certain that no pedestrian it can currently see can manage to jump in front of it before it can stop, and no pedestrian who might be hiding in a shrub out of LIDAR view could jump in front of the car.


you seem to be exhibiting some of the overly-reductive hubris that is at issue here.

your assumptions about confidence speed being a magical catch-all are faulty.  the systems used to define and measure the required level of "confidence" are themselves prone to failure.

in the Uber case i posted above, Uber had willfully disengaged the Volvo’s factory settings for automatic emergency braking and collision avoidance.  and evidently this Tesla car was operating with a suspect level of confidence when it killed its occupant:

[Diagram from the police report: the vehicle in self-driving mode (V02) struck a tractor-trailer (V01) as it was turning left.]

according to Tesla "neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky".  we will never know for sure if the same human, or a 'reasonable person', would have spotted the trailer were they not chaperoning a robot at the time.  

this incident likely exposed shortfalls in the sensor system.  or perhaps those systems were operating faultlessly at levels of acuity far beyond human capability.   in either case, though, it would still be arguable that the driving speed was overly confident.  

acceptable risk is not something that is trivial to codify.   when you factor out incompetence, we humans routinely exceed our personal confidence speed.  if we did not, traffic would grind to a crawl.  a cyclist flying out of some bushes is one thing, while a 100km/h red-light runner at an intersection with poor visibility on the approach is quite another.   to a large extent we take on acceptable risks we define for ourselves as individuals.  if driverless cars are to function amongst human drivers, there is a practical need for them to drive at speeds not optimally safe — by design.  the designers must quantify how unsafe is safe enough.

there will always be eventualities that neither man nor machine can anticipate or prevent.  but if the above incident was indeed preventable, it's a shame you weren't consulted beforehand.  you would have said "It's simple, Elon, just set the confidence speed correctly and it's all good man", collected your fee, and that man would still be alive today.


Given the crap driving I see every day I guess it could only be an improvement. I wonder if driverless cars will have a horn ?

I had a classic today: I was honked at by a moron whilst turning into my local shopping center. Figured ok, maybe I have a blown indicator globe. But he pulled up alongside me so I asked him.

No, he thought I could have turned before the  oncoming traffic...

"Listen shitbrain, I make  the driving decisions in my car, not you, fuck off. "

He went to get aggro on that, then realised he was significantly shorter than me, and way below what looked like my fitness. (I'm not that fit, ebbing disability, but dealing with a person trying to attack you whilst getting out of a car is painful, for the other person, not me.)

Reminded him it was double demerits weekend, we were in a 50 km/h zone, and wished him well on keeping his licence.

Last laugh, we were both going to the same bottle shop - owner is a friend of mine, no-nonsense South African - refused to serve him because he had behaved like a moron and might be intoxicated ?

I agree with stadl,  I like driving myself, I don't like sharing the roads with idiots...

Cheers

 

 

 

4 hours ago, @~thehung said:


a cyclist flying out of some bushes is one thing, while a 100km/h red-light runner at an intersection with poor visibility on the approach is quite another.   to a large extent we take on acceptable risks we define for ourselves as individuals.  if driverless cars are to function amongst human drivers, there is a practical need for them to drive at speeds not optimally safe — by design.  the designers must quantify how unsafe is safe enough.

That's where we disagree, I don't see that practical need. Self-driving cars are going to push out human drivers, so it doesn't matter if the human drivers feel the traffic is going "too slow". Insurance companies won't want to cover humans who might run red lights at 100km/h when they could cover robots that never do, and so increasing insurance costs will marginalise humans. On top of that, the ability to time-share a car between 3+ people and have it autonomously route between them will rapidly speed the adoption of self-driving cars once they become available, with one obvious result:

Self-driving cars will form "scar tissue" around human drivers on main roads to prevent them from messing with the traffic flow. When you drive in a traditional car, you'll get gently and tactfully boxed in by the three nearest self-driving cars, they will watch your movements and indicators and move with you to let you get where you indicate you want to go, but they will body-block you from running a red light at 100km/h or even getting to 100km/h, so that other cars can be more certain of your behavior. And yeah, that means those self-driving cars are going to force you to drive at 15km/h if they're not sure you can drive faster without causing an accident.

1 hour ago, Sir_Substance said:

When you drive in a traditional car, you'll get gently and tactfully boxed in by the three nearest self-driving cars, they will watch your movements and indicators and move with you to let you get where you indicate you want to go, but they will body-block you from running a red light at 100km/h or even getting to 100km/h, so that other cars can be more certain of your behavior. And yeah, that means those self-driving cars are going to force you to drive at 15km/h if they're not sure you can drive faster without causing an accident.

Wow! Really ? ... That would seriously curb a motor bike or even a pushbike too . I'm talking about the bikes ( motor and pedal ) that like to get around and through the traffic


8 hours ago, eveln said:

Wow! Really ? ... That would seriously curb a motor bike or even a pushbike too . I'm talking about the bikes ( motor and pedal ) that like to get around and through the traffic

I certainly consider it to be the logical development once there's a critical mass of self-driving cars on main roads. It'll be much cheaper than re-designating some roads as self-driving only, so it'll get government support provided it can be done accurately enough. Yes, I think it'll happen to motorbikes. They're not actually allowed to sneak up the middle of lanes in most places anyway, and they'll slow down all the traffic behind them at the lights because they won't be able to accelerate in a single unified block with the 50 networked self-driving cars behind them.

Cyclists are harder to deal with because they're so much slower than cars, and not as well equipped. It's pretty easy for car manufacturers to program the scar tissue planner so that if a human-driven car doesn't indicate, it won't be given a slot to cross traffic, which would enforce good behavior on human drivers. You could *try* that with arm-signalling for bikes but it's a harder computer vision problem to solve and not everyone has two arms. I don't have a good prediction for that one, other than probably self-driving cars avoiding lanes with cyclists in them entirely.

