• FiniteBanjo@feddit.online · 18 days ago

    TBF, OpenAI are a bunch of idiots running the world’s largest Ponzi scheme. If DeepMind tried it and failed, then…

    Well, I still wouldn’t be surprised, but at least it would be worth citing.

    • chickenf622@sh.itjust.works · 17 days ago

      I think the inherent issue is that current “AI” is inherently non-deterministic, so it’s impossible to fix these issues completely. You can feed an AI all the data on how not to sound like AI, but you need massive amounts of non-AI writing to reinforce that. With AI being so prevalent nowadays, you can’t guarantee a dataset is AI-free, so you get the old “garbage in, garbage out” problem that AI companies cannot solve. I still think generative AI has its place as a tool (I use it for quick-and-dirty text manipulation), but it’s being applied to every problem we have like it’s a magic silver bullet. I’m ranting at this point, so I’ll stop here.

      • FiniteBanjo@feddit.online · 17 days ago

        I honestly disagree that it has any use. Being a statistical model with high variance makes it a liability: no matter which task you use it for, it will produce worse results than a human being and create new problems that didn’t exist before.

        • chickenf622@sh.itjust.works · 17 days ago

          The high variance is why I only use it for dead-simple tasks, e.g. “create an array of US state abbreviations in JavaScript”; otherwise I’m in full agreement with you. If you can’t verify the output is correct, then it’s useless.
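
          For illustration, a quick sanity check on that kind of output might look something like the sketch below (the function name and the exact checks are just illustrative placeholders, not anything from the thread; it can catch an obviously broken list, but it can’t prove the list is actually correct):

          ```ts
          // Hypothetical sanity check for an LLM-generated array of US state abbreviations.
          // It only catches the obvious failure modes: wrong count, duplicates, or entries
          // that aren't two uppercase letters. It can't confirm the entries are real states.
          function looksLikeStateAbbreviations(list: string[]): boolean {
            const unique = new Set(list);
            const wellFormed = list.every((s) => /^[A-Z]{2}$/.test(s));
            return list.length === 50 && unique.size === 50 && wellFormed;
          }

          // Usage: paste the model's output in and run the check before trusting it.
          console.log(looksLikeStateAbbreviations(["AL", "AK", "AZ"])); // false: only 3 entries
          ```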

          • GojuRyu@lemmy.world · 17 days ago

            Wouldn’t that actually be slower, simply because checking that it got all the states, didn’t repeat any, and didn’t make any up would take longer than copying a list from the web and quickly turning it into an array by hand with multi-line cursors?

        • frank@sopuli.xyz · 17 days ago

          I think the best use is “making filler”: in a game, say, some deep background shit that no one looks at, or a fake advertisement in a cyberpunk-type game. Something to fill the world out and reduce the work for real artists, if they choose to use it.

          • FiniteBanjo@feddit.online · 17 days ago

            If you can’t be bothered to write filler then it’s an insult for you to expect others to read it. You’re just wasting people’s time.

            • frank@sopuli.xyz · 17 days ago

              I guess the point is for people to not read the filler.

              I think of the text that’s too small to read on a computer in the background. It’s nice that it looks slightly more real than a copy/pasted screen.

              Not even close to worth destroying the environment over, but it’s a neat use case to me.

              • Catoblepas@piefed.blahaj.zone · 17 days ago

                “I think of the text that’s too small to read on a computer in the background.”

                Lorem ipsum has been used in typesetting since the 60s. If it’s not meant to be read, it doesn’t matter if it’s lorem ipsum text.

                Not trying to dogpile you; I just think even the things that seem ‘useful’ for LLMs almost always have preexisting solutions that are decades old.

      • vala@lemmy.dbzer0.com · 17 days ago

        FWIW, LLMs are deterministic. Usually the commercial front-ends don’t let you set the seed, but behind the scenes the only reason the output changes each time is that the seed changes. With a fixed seed, input X always leads to output Y.
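
        To make that concrete, here’s a toy sketch (not any real model’s code; the PRNG, the tiny “vocabulary”, and the fake token scores are all made up for illustration) of where the apparent randomness lives: the sampler draws the next token from a probability distribution using a seeded RNG, so fixing the seed fixes the whole output, and a temperature of 0 skips the RNG entirely by always taking the most likely token.

        ```ts
        // Toy illustration only: a fake "next-token" distribution plus a small seedable PRNG.
        // A real LLM recomputes the distribution from the context at every step, but the
        // sampling mechanics are the same: the randomness lives in the seed, not the model.
        function mulberry32(seed: number): () => number {
          // Seedable PRNG standing in for the sampler's random source.
          let a = seed >>> 0;
          return () => {
            a = (a + 0x6d2b79f5) >>> 0;
            let t = Math.imul(a ^ (a >>> 15), 1 | a);
            t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
            return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
          };
        }

        const vocab = ["the", "cat", "sat", "on", "mat"];
        const logits = [2.0, 1.5, 0.5, 0.2, 0.1]; // fake scores for the next token

        function sampleToken(rand: () => number, temperature: number): string {
          if (temperature === 0) {
            // Greedy decoding: no randomness at all, always the highest-scoring token.
            return vocab[logits.indexOf(Math.max(...logits))];
          }
          const weights = logits.map((l) => Math.exp(l / temperature)); // softmax numerators
          const total = weights.reduce((a, b) => a + b, 0);
          let r = rand() * total;
          for (let i = 0; i < vocab.length; i++) {
            r -= weights[i];
            if (r <= 0) return vocab[i];
          }
          return vocab[vocab.length - 1];
        }

        function generate(seed: number, temperature: number, length = 5): string {
          const rand = mulberry32(seed);
          return Array.from({ length }, () => sampleToken(rand, temperature)).join(" ");
        }

        console.log(generate(42, 0.8) === generate(42, 0.8)); // true: same seed, same output
        console.log(generate(42, 0.8) === generate(7, 0.8));  // usually false: new seed, new output
        console.log(generate(1, 0) === generate(999, 0));     // true: temperature 0 ignores the seed
        ```

        In practice, hosted APIs add other sources of variation on top of the sampler (batching, hardware and backend changes), which is probably why setting the temperature to 0 alone doesn’t reproduce results from one day to the next.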

        • ThirdConsul@lemmy.zip · 16 days ago

          From the user’s perspective: no? I think that’s what they call “temperature”, and even setting it to 0 didn’t give me the same result the next day, after the cache was cleared.