• pseudo@jlai.lu

    When you delegate, whether to a person, a tool, or a process, you check the result. You make sure the delegated task gets done correctly and that the results are what you expected.

    Finding out only by luck, months later, that this wasn’t the case shows incompetence. Look for the incompetent.

    • flying_sheep@lemmy.ml

      Yeah. Trust is also a thing: if you delegate to a person you’ve seen get the job done multiple times before, you won’t check as closely.

      But this person asked to verify and was told not to. Insane.

    • Tja@programming.dev

      100%

      Hallucinations are widely known; this is a collective failure of the whole chain of leadership.

    • jj4211@lemmy.world

      The problem is that whoever is checking the result in this case had to do the work anyway, and in that case… why bother with an LLM that can’t be trusted to pull the data in the first place?

      I suppose they could take the facts and figures that a human pulled and have an LLM verbose it up for people who, for whatever reason, want needlessly verbose BS. Or maybe an LLM could review the human-generated report to help flag awkward writing or inconsistencies. But delegating work that you then have to redo yourself just to double-check it seems pointless.

      • pseudo@jlai.lu

        Like someone here said, “trust is also a thing”. Once you’ve checked a few times that the process is right and the results are right, you only need to spot-check from then on. Unfortunately, that’s not what happened in this story.