• architect@thelemmy.club · +7/−3 · 1 day ago

    I can’t be the only one who thinks that if you do stupid illegal shit because your crazy uncle / the voices in your head / your AI mirror told you to, you don’t get to use the “just following orders” excuse for any of those options.

    • Snowclone@lemmy.world · +6 · 22 hours ago

      That’s not the problem. The problem is having a “let’s turn Chris’s mental illness, which has harmed no one so far, into everyone’s violent problem!” machine.

      That’s a bad machine.

    • dream_weasel@sh.itjust.works · +3 · edited · 19 hours ago

      The difference is that when an LLM tells you, it’s news.

      Besides, what are you gonna do if you ask AI how many rocks to eat? NOT eat rocks? People can’t handle responsibility like that.

    • moonshadow@slrpnk.net · +2 · 24 hours ago

      Power imbalance is what validates that excuse. “Orders from my crazy uncle” is a great excuse, at least until you’re 10 or so. A billion-plus-dollar LLM company has far more resources and capability, and therefore far more responsibility, than the poor bastards engaging with it.

      • Hazor@lemmy.world · +1/−1 · 18 hours ago

        Not just suicide-assistance chatbots, but suicide-promotion chatbots.