10 comments

  • RobotToaster 18 hours ago
    "A computer can never be held accountable, therefore a computer must never make a management decision" - IBM 1979
    • rqtwteye 17 hours ago
      2030: "Nothing is our fault because the computer did it. We get the profits though."
  • gweinberg 17 hours ago
    Has "the AI did it" ever been a valid defense? I can see why the legislature might want to make it explicit that it is not, but surely it can't be the case that harm caused by an AI is treated as if it were an act of God.
    • malfist 16 hours ago
      Not yet probably, but "too expensive to comply" is a common refrain from big tech when they break the law. Look at all the excuses Uber has made
    • bean-weevil 11 hours ago
      I agree. If you hire someone, and they aggrieve someone in the course of their duties, your contractor is your agent and you're responsible for what they did. It makes no sense to treat AI any differently.
    • kjs3 10 hours ago
      While it's not been wrung out in the courts yet, the insurance industry has been denying coverage based on "the AI said so" for a while, coupled tightly with "we can't tell you why because the algorithm is a trade secret". Given the current governance and oversight climate, I don't have much hope that more transparency is in the offing.
  • gibbitz 17 hours ago
    Thank God, civil lawsuits can save us from AI overuse since the government has reversed their position.
  • alexd127 18 hours ago
    How would user vs. developer be distinguished? It looks like it mentions both users and developers but doesn't distinguish how one would be found liable.
    • rileymat2 18 hours ago
      > This bill would prohibit a defendant that developed or used artificial intelligence, as defined, from asserting a defense that the artificial intelligence autonomously caused the harm to the plaintiff.

      I am not sure the distinction matters?

      • dzink 17 hours ago
        If you’re driving a self-driving car, you can’t blame it on the car.
        • Muromec 17 hours ago
          You can blame it on a car even if the car is not self-driving
          • Terr_ 17 hours ago
            I think there's a potential confusion here between different kinds of "blame."

            If your robot punches a bystander, you're liable for their hospital bills, separately from whether you're culpable for battery.

            • kjs3 10 hours ago
              Perhaps you can cite when this was tested in court, because "robot punches man" isn't something I've noted in the headlines. Otherwise, it's nothing but speculation.
          • Retric 17 hours ago
            It’s your responsibility to ensure that your breaks etc work.

            Presumably it’s also your responsibility to pick and maintain a working self driving system.

            • cgriswald 17 hours ago
              Vehicles have defects and sometimes the manufacturer is at fault for accidents.

              The upstream poster is correct. This new law has no relevance to who is liable. It would simply remove "the AI did it" as a valid defense in any case (with whatever exceptions are defined in the existing law referenced).

              • Retric 17 hours ago
                Like the resale of stolen property. The victim can always just sue you; you may be able to sue the person you got it from.
                • cgriswald 17 hours ago
                  You can always be sued, and this may be a difficult defense to make in practice, but drivers are not generally liable for accidents caused by mechanical failure if the cause is manufacturing defect.
                  • Retric 17 hours ago
                    Sure, if you can demonstrate that there’s no reasonable way the fault could have been predicted or avoided then that can get you off the hook.

                    However, if your tire blows out, you'd be expected to demonstrate that you regularly inspect them for wear or damage and that there hasn't been a recall, etc. That same level of proactive care is going to be applied to self-driving systems.

                    • cgriswald 16 hours ago
                      I don’t see anyone really disputing that. To the extent I’ve participated in this thread it has been to make clear:

                      1. Both car manufacturers and car drivers can be liable, even with self-driving cars. Any confusion here is likely due to conflating the car with the car manufacturer.

                      2. The proposed law wouldn’t assign liability, it would simply remove “the AI did it” as a possible legal defense.

                      • Retric 15 hours ago
                        What I think people are ignoring is choosing to be an early adopter of fully autonomous self driving vehicles is itself going to be questioned.

                        Being the first member of the general public to use a 100% self driving car the first day it’s available might even be considered reckless if it then crashes that day.

                        Later on, if a model is performing poorly, operating such a vehicle could be called into question, etc.

            • Muromec 13 hours ago
              >Presumably it’s also your responsibility to pick and maintain a working self driving system.

              I guess it's reasonable to say I should apply updates and get the car serviced and make sure the sensors are not obstructed, yes. Then the difference between "self driving" and "assisting" technology would be a matter of guarantee the manufacturer advertises.

              Would it (hypothetically) be reasonable for me to expect that the thing will suddenly break on a highway in normal conditions without anything obstructing the way forward? No, I don't think so. Can I do something to prevent it or foresee it, except not using the self-driving technology at all? Can I choose to have a different self-driving tech installed in my car and retrain the model or control its behavior in any way at all?

              There are different answers to that and I guess the answers will also change over time.

            • echoangle 17 hours ago
              I don’t think it’s that binary. You have some duty to make an effort of maintenance but there can still be accidents that are just technical faults no one is really responsible for.
              • Retric 17 hours ago
                Good luck arguing that to a jury.

                It’s true in some cases the manufacturer or car mechanic etc is at fault rather than the owner, but it’s difficult to offload responsibility to a 3rd party.

                • jdietrich 16 hours ago
                  People (or rather their insurers) successfully argue that to a jury all the time. If your vehicle is serviced by a professional mechanic according to the manufacturer's recommendations, it's very difficult to argue that you're liable for the consequences if your brakes suddenly fail. You took all reasonable steps to ensure that your vehicle was in a roadworthy condition. If you didn't bother to follow the manufacturer's recommendations, then you're on your own.
                  • Retric 15 hours ago
                    You’re making this seem easier than it is. Even just convincing people the brakes actually failed is a hurdle.

                    Someone trying to defend themselves by saying the brakes failed needs to show the brakes alone are the cause of the accident rather than just a contributing factor. So there was no alternative like using a parking brake, and the driver didn't get into an unsafe situation.

                    Similarly the failure must be sudden and not predictable etc.

            • DerekL 16 hours ago
              I think you meant “brakes”.
        • techjamie 17 hours ago
          That wouldn't work anyway. In basically every jurisdiction, operators of vehicles are expected to retain control of the vehicle regardless of self driving status.
          • bobthepanda 17 hours ago
            I think it’s useful for legislators to do their job and explicitly define and clarify the boundaries of the law. You can’t necessarily just rely on precedent for new things.
    • Kinrany 18 hours ago
      Not a new question: "the AI did it" is an equivalent of "the gun did it" or "the car did it".
  • TheRealPomax 17 hours ago
    It's unclear whether this applies to state law enforcement or only the citizenship. It also seems to focus not on "invalidating the defense" but on "punishing the developers" which is more than a little weird.
    • inetknght 16 hours ago
      > It's unclear whether this applies to state law enforcement or only the citizenship

      A few months ago I might have pointed out that nobody is above the law.

      Alas, you now have a damn good point.

  • SpicyLemonZest 16 hours ago
    The "AI did it" defense as defined in this bill seems like it clearly ought to be valid. If a CNC machine injures someone, and the victim sues the operator, I don't see why he should be unable to argue that it wasn't his fault because the machine malfunctioned.
  • breadwinner 17 hours ago
    The bill is specifically about cars, but the site doesn't make that clear. Here's a better link: https://calmatters.digitaldemocracy.org/bills/ca_202320240ab...
  • nine_zeros 18 hours ago
    I hope that by "AI developers" they mean the owners of the IP, and not the actual devs working on it.
    • TZubiri 15 hours ago
      As I understand it, the concept of a causal chain is quite standard in liability law.
  • deadbabe 18 hours ago
    This bill sucks; a developer should never be held accountable for software they work on or produce.
    • IX-103 12 hours ago
      Professional engineers are held liable for mistakes in their work. If a house falls or a dam breaks, the responsibility falls on them to show it wasn't built as designed or there was some other factor involved that a professional engineer would not be expected to have considered in their design.

      And, yes, as others have mentioned they get insurance, but there's more to it than that. The level of verification that those designs go through before being released is on a completely different level than what is usually applied to software.

    • roughly 17 hours ago
      That's an insane standard. Your work has real-world consequences, act accordingly.
      • codr7 17 hours ago
        Hm, once we draw that line a shit ton of people will have to start asking difficult questions about their work.

        Not saying it's a bad idea, but it has consequences that I suspect you have not considered.

        It's very rare for the people writing the code to have any say in what it does.

        • otterley 15 hours ago
          > It's very rare for the people writing the code to have any say in what it does.

          That's no defense. Employment is voluntary, and you cannot break the law while doing your job, even if your employer commands you to.

          • TZubiri 15 hours ago
            It's also a separate issue. One thing is whether we are responsible for our work, and another is whether employers or employees are responsible.

            In the second case responsibility doesn't vanish, it is transferred.

            • roughly 12 hours ago
              If your argument is that employees are not responsible for the results of their work - not legally accountable, but morally responsible - I strongly disagree with this. You are always morally responsible for the results of your actions. Always. There may be mitigating circumstances - you need health care for your child’s chronic medical condition, you were faced with an impossible choice and chose the lesser evil, someone literally held a gun to your head - in which case the full moral picture may absolve you, but you still performed the actions that led to the outcomes in question, and you are still responsible for your actions and their consequences.
              • TZubiri 11 hours ago
                That is yet a third separate question. Let's recap:

                1- Are software developers (whether a company as a whole, or an independent developer) responsible for the software they develop?

                2- When a company develops software, is the responsibility on the company owners, the managers, or the employees, or is it shared among all of them?

                3- What is the moral responsibility and what is the legal responsibility? In what ways are they similar, and in what ways are they different?

                At this point I'd forego the hopes of getting an answer and just focus on trying to make clear questions.

                • roughly 10 hours ago
                  You’re putting in a lot of effort to avoid actually answering any of these questions. I’m sorry if your work leads you to not want those answers.

                  You bear moral responsibility for the outcomes of your actions, whether you do them on your own time or while acting as the agent of a company. You do not get to “transfer” moral responsibility. Soldiers are responsible for their conduct, employees for theirs.

                  And, the overlap between legal and moral responsibility is that legal responsibility sometimes aligns with moral responsibility, but that’s irrelevant for the discussion of the moral.

                  And to cut off the obvious question of “whose morality” - yours. You are accountable to whatever moral system you prefer to live by, but you are always accountable for your actions and their consequences, even when that’s uncomfortable or you’d prefer not to be. We don’t get days off from being human.

          • codr7 11 hours ago
            So if I write, say, the ORM code for a system that causes some kind of (completely unknown to me) harm, I'm guilty.

            Sorry, that doesn't compute for me.

            • otterley 10 hours ago
              No, but if your employer asks you to write software specifically designed to defraud a financial institution, yes, you are breaking the law.
        • roughly 15 hours ago
          Legally liable is one thing, but if you contribute to a piece of software that has real-world consequences - your trading algorithm generated the money used to fund an election, your guidance software was used to bomb civilians, your algorithmic tweaks led to vaccine denialism becoming widespread - that's consequences of your actions. You can dress it up any way you want - you've got bills to pay, you were ordered to do so, you didn't see it coming, you can't control how people use your software - but at the end of the day, you contributed to those outcomes.
      • lesuorac 17 hours ago
        Eh, I'm not even sure why the developer would ever be held accountable for developing things as required. It's not like a machinist is ever held responsible for a firearm death or a barrel maker is held responsible for an alcohol death.

        Making tools is just not a liable line of work. Using the tools or selling the tools historically is. _Brewers_ not barrel makers have been sued for alcohol deaths.

        However, if you were to add a backdoor to some software then yeah that's just general crime.

        • otterley 15 hours ago
          > Making tools is just not a liable line of work.

          The law says otherwise. https://www.law.cornell.edu/wex/products_liability

          • lesuorac 13 hours ago
            If I sell you a hammer and the hammer breaks sure I'm liable.

            If I sell you a hammer and you hammer in somebody's head, I'm not liable.

            The second scenario is what we're actually talking about for developers. If I write VoIP code and you use it to commit ransomware, nobody is taking a developer to court over it. At best, corporate is going to court.

            Even in the first scenario, you can't sue the individual worker that produced the hammer. You're limited to getting a refund or replacement of the hammer from corporate.

    • tylerchilds 17 hours ago
      My reading of this is that it will be tied to owners, not necessarily authors.

      Today, when code does illegal things, the company gets sued.

      This applies that to AI as well, which is that employers are liable, not that they can pin it on the dev that actually made the implementation.

      You’re still incentivized to not get your employer sued, since you’ll get fired, same as today.

      This just removes the “a robot did it” immunity. For self-driving cars, this is a win, since as it currently stands, if you die due to Waymo, your literal death is treated as a hypothetical.

    • rileymat2 17 hours ago
      But then won't you get a bunch of things with tons of fine print that no one reads saying "do not use this for these purposes"? And then users use them for those purposes, harming others?
    • Muromec 17 hours ago
      You just get insurance, like every other reputable professional does.
    • unsnap_biceps 17 hours ago
      Wouldn't that broad statement also apply to malware or other malicious software development practices?
    • TZubiri 15 hours ago
      Lol, found the mad scientist
    • Cheer2171 16 hours ago
      yeah, what do you think this field is called, software engineering?
  • doctorpangloss 17 hours ago
    Ha ha, can’t say AI did it, but hundreds of people die yearly in California to drivers claiming pEdAl cOnFuSiOn, and that’s okay.
    • grepfru_it 17 hours ago
      It’s not okay though; what you described is still the fault of the driver
      • doctorpangloss 17 hours ago
        Prepare to be pissed off when you punch this stuff into Google and study this issue.
        • grepfru_it 13 hours ago
          Google provides me with a list of California law firms willing to defend against a pedal-confusion liability claim. Maybe I’m missing something?
          • doctorpangloss 13 hours ago
            You're supposed to add "reddit" to Google searches to make them work well.

            The punchline is that you can kill somebody in California using a car, claim pedal confusion, and all else being equal, you will face no criminal consequences. You will almost never be financially ruined either. The point of this comparison is that California assembly members write a lot of bills, but they rarely seem to find bandwidth for addressing any of its numerous crises of direct responsibility. It's always some fashionable bullshit.