• Thorry@feddit.org · 11 points · 9 days ago

    I’m pretty sure it isn’t that easy to do what that dude did. It’s a multi-step process. It doesn’t say “This will delete your data, are you sure you want to continue?”, but it also isn’t like he clicked the X in the top right and all of his data was gone. The wording of the function is pretty clear, and there are plenty of ways to find out what it does. The dude himself admits he wanted to know whether he could toggle it and still have access to his data, but instead of asking the chatbot beforehand he just tried it, then cried foul when it actually locked him out.

    • ZDL@lazysoci.al · 4 points · 9 days ago

      It really should, though. If you’re making software for people to use, it should point out the consequences, especially when those consequences aren’t an obvious result of the requested action. There’s a sizable divide between “don’t share my data” and “OK, we’ll delete everything”.

      Don’t get me wrong. ChatGPT is a festering pile of shit even without this. This “professor” should be stripped of his teaching credentials and be thrown into an LLMbecile detox centre, only allowed to exit when he learns to think for himself. These are both true.

      But it’s also true that when an action has drastic implications that aren’t an obvious outcome of the requested function, the software should warn you.
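
      For what it’s worth, the warning I mean is trivial to build. A rough sketch, assuming a hypothetical settings function (the names here are made up, not ChatGPT’s real code):

```python
# Sketch of a destructive toggle that spells out its non-obvious consequence
# and requires explicit consent before doing anything irreversible.
from typing import Callable


def disable_data_sharing(confirm: Callable[[str], bool]) -> bool:
    """Turn off data sharing, warning first that this also wipes stored chats."""
    warning = (
        "Turning off data sharing will also permanently delete your saved "
        "conversations. This cannot be undone."
    )
    if not confirm(warning):            # caller supplies the UI prompt
        return False                    # user backed out; nothing changes
    _delete_stored_conversations()      # the non-obvious, destructive part
    _set_sharing(False)                 # the part the user actually asked for
    return True


def _delete_stored_conversations() -> None:
    ...  # storage layer not modelled in this sketch


def _set_sharing(enabled: bool) -> None:
    ...  # ditto


if __name__ == "__main__":
    # Plain console confirmation, just to make the sketch runnable.
    disable_data_sharing(
        lambda msg: input(msg + "\nContinue? [y/N] ").strip().lower() == "y"
    )
```

      That’s the whole divide: one extra prompt that names the consequence, instead of silently folding “delete everything” into “stop sharing”.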