• Maiq@lemy.lol · 7 days ago
    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

    2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    • xia@lemmy.sdf.org · 6 days ago

      Could you imagine an artificial mind actually trying to obey these? You can’t even get past #1 without being aware of the infinite number of things you could do, Cartesian-producted with all the consequential downstream effects of those actions until the end of time.
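
      The blow-up being gestured at can be made concrete with a toy sketch (the action names and planning horizon here are made up purely for illustration):

      ```python
      from itertools import product

      # Hypothetical, tiny action space for the robot.
      actions = ["move", "speak", "wait", "grasp"]
      horizon = 10  # how many steps of downstream effects we consider

      # Every ordered sequence of actions over the horizon is one branch the
      # robot would have to evaluate for potential harm under Law #1.
      # Even at 3 steps, that's the full Cartesian product of the action space
      # with itself: 4 * 4 * 4 = 64 branches.
      print(len(list(product(actions, repeat=3))))  # 64

      # The count grows exponentially with the horizon: |actions| ** horizon.
      print(len(actions) ** horizon)  # 1048576 branches for just 10 steps
      ```

      And a real mind's action space isn't four discrete verbs, nor does "until the end of time" stop at ten steps, which is the comment's point.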

    • Archangel@lemm.ee · 6 days ago

      No one ever explained why they had to obey those laws in the first place…only that they had to.