• Lemminary@lemmy.world
    link
    fedilink
    arrow-up
    20
    ·
    3 days ago

    Our AI that monitors customer interactions sometimes makes up shit that didn’t happen during the call. Any agent smart enough could probably fool it into giving the wrong summary with the right key words. I only caught on when I started reading the logs carefully, but I don’t know if management cares so long as the business client is happy.

    • jj4211@lemmy.world
      link
      fedilink
      arrow-up
      2
      ·
      2 days ago

      Sounds like material that is generated that the executives demand be generated but never actually uses. My work has a ton of this, because the executives want people to feel like they are accountable and being reviewed even as they know the executives don’t understand the direct output of their work, so people have to do the technical thing and separately eternally do non-technical writeups of what the technical work meant. I think someone checked and the executives didn’t even log into the system they demanded.

      So LLM to generate the bullshit that no one wants to write or read but wants to pretend it’s important.