76 points | by opawlowski 16 hours ago
I am not sure the distinction matters?
If your robot punches a bystander, you're liable for their hospital bills, separately from whether you're culpable for battery.
Presumably it’s also your responsibility to pick and maintain a working self driving system.
The upstream poster is correct. This new law has no relevance to who is liable. It would simply remove "the AI did it" as a valid defense in any case (with whatever exceptions are defined in the existing law referenced).
However, if your tire blows out, you’d be expected to demonstrate that you regularly inspect your tires for wear or damage, that there hasn’t been a recall, etc. That same level of proactive care is going to be applied to self-driving systems.
1. Both car manufacturers and car drivers can be liable, even with self-driving cars. Any confusion here is likely due to conflating the car with the car manufacturer.
2. The proposed law wouldn’t assign liability, it would simply remove “the AI did it” as a possible legal defense.
Being the first member of the general public to use a 100% self driving car the first day it’s available might even be considered reckless if it then crashes that day.
Later on, if a model is performing poorly, operating such a vehicle could be called into question, etc.
I guess it's reasonable to say I should apply updates, get the car serviced, and make sure the sensors are not obstructed, yes. Then the difference between "self driving" and "assisting" technology would be a matter of the guarantee the manufacturer advertises.
Would it (hypothetically) be reasonable for me to expect that the thing will suddenly break on a highway in normal conditions, without anything obstructing the way forward? No, I don't think so. Can I do anything to prevent or foresee it, other than not using the self-driving technology at all? Can I choose to have a different self-driving system installed in my car, retrain the model, or control its behavior in any way at all?
There are different answers to that and I guess the answers will also change over time.
It’s true in some cases the manufacturer or car mechanic etc is at fault rather than the owner, but it’s difficult to offload responsibility to a 3rd party.
Someone trying to defend themselves by saying the brakes failed needs to show the brakes alone were the cause of the accident rather than just a contributing factor. So there was no alternative, like using a parking brake, and the driver didn’t get into an unsafe situation.
Similarly the failure must be sudden and not predictable etc.
Tesla gives you the old with the new.
A few months ago I might have pointed out that nobody is above the law.
Alas, you now have a damn good point.
And, yes, as others have mentioned they get insurance, but there's more to it than that. The level of verification that those designs go through before being released is on a completely different level than what is usually applied to software.
Not saying it's a bad idea, but it has consequences that I suspect you have not considered.
It's very rare for the people writing the code to have any say in what it does.
That's no defense. Employment is voluntary, and you cannot break the law while doing your job, even if your employer commands you to.
In the second case responsibility doesn't vanish, it is transferred.
1- Are software developers (whether a company as a whole, or an independent developer) responsible for the software they develop?
2- When a company develops software, does the responsibility fall on the company owners, the managers, the employees, or all of them in a shared manner?
3- What is the moral responsibility and what is the legal responsibility? In what ways are they similar, and in what ways are they different?
At this point I'd forego the hopes of getting an answer and just focus on trying to make clear questions.
You bear moral responsibility for the outcomes of your actions, whether you do them on your own time or while acting as the agent of a company. You do not get to “transfer” moral responsibility. Soldiers are responsible for their conduct, employees for theirs.
And, the overlap between legal and moral responsibility is that legal responsibility sometimes aligns with moral responsibility, but that’s irrelevant for the discussion of the moral.
And to cut off the obvious question of “whose morality” - yours. You are accountable to whatever moral system you prefer to live by, but you are always accountable for your actions and their consequences, even when that’s uncomfortable or you’d prefer not to be. We don’t get days off from being human.
Sorry, that doesn't compute for me.
Making tools is just not a liable line of work. Using the tools or selling the tools historically is. _Brewers_, not barrel makers, have been sued for alcohol deaths.
However, if you were to add a backdoor to some software then yeah that's just general crime.
The law says otherwise. https://www.law.cornell.edu/wex/products_liability
If I sell you a hammer and you hammer in somebody's head, I'm not liable.
The second scenario is what we're actually talking about for developers. If I write VoIP code and you use it to commit ransomware, nobody is taking a developer to court over it. At best, corporate is going to court.
Even in the first scenario, you can't sue the individual worker that produced the hammer. You're limited to getting a refund or replacement of the hammer from corporate.
Today, when code does illegal things, the company gets sued.
This applies that to AI as well, which is that employers are liable, not that they can pin it on the dev that actually made the implementation.
You’re still incentivized to not get your employer sued, since you’ll get fired, same as today.
This just removes the “a robot did it” immunity. For self-driving cars, this is a win, since, as things currently stand, if you die due to Waymo, your literal death gets resolved as a hypothetical.
The punchline is that you can kill somebody in California using a car, claim pedal confusion, and all else being equal, you will face no criminal consequences. You will almost never be financially ruined either. The point of this comparison is that California assembly members write a lot of bills, but they rarely seem to find bandwidth for addressing any of its numerous crises of direct responsibility. It's always some fashionable bullshit.