• pixxelkick@lemmy.world
      15 days ago

      In English, the word “they” refers to the last primary subject you mentioned, so you should be able to infer what “they” referred to in my sentences. I’ll let you figure it out.

      “I love wrenches, they are very handy tools”, in this sentence, the last subject before the word “they” was “wrenches”, so you should be able to infer that “they” referred to “wrenches” in that sentence.

      • Windex007@lemmy.world
        15 days ago

        Ok, well, I was actively trying to avoid jumping to the conclusion that your assertion was that an LLM can tell you what it does.

        I was actively avoiding that conclusion as an act of charity.

          • Windex007@lemmy.world
            14 days ago

            Hence my attempt to give you the space to provide clarity.

            For me, this isn’t a pissing contest. I’m trying to provide you with the latitude to clarify your position. I’ll be honest, I didn’t appreciate your condescending lecture on the English language.

            • pixxelkick@lemmy.world
              14 days ago

              I apologize for any confusion.

              I meant LLMs are what they say they are in a non-literal sense.

              Akin to ascribing the same to any other tool.

              “I like wrenches cause they are what they say they are, nothing extra to them” in that sort of way.

              In the sense that the tool is very transparent in function. No weird bells or whistles; it’s a simple machine whose workings you can see merely by looking at it.

              • Windex007@lemmy.world
                14 days ago

                I think I understand your point now.

                I still would want to apply pressure to it, because I disagree with the spirit of your assessment.

                Once a model is trained, it becomes functionally opaque. Weights shift… but WHY? What does that vector MEAN?

                I think wrenches are good. Will a 12mm wrench fit a 12mm bolt? Yes.

                In LLM bizarre world, the answer to everything is not “yes” or “no”, it’s “maybe, maybe not, within statistical bounds… try it… maybe it will… maybe it won’t… and by the way just because it fit yesterday is no guarantee it will fit again tomorrow… and I actually can’t definitively tell you why that is for this particular wrench”

                LLMs do something, and I agree they do that something well. I further agree with the spirit of most of the rest of your analysis: abstraction layers are doing a lot of heavy lifting.

                I think where I fundamentally disagree is that “they do what they say they do” by any definition beyond the simple tautology that everything is what it is.

                • pixxelkick@lemmy.world
                  14 days ago

                  Once a model is trained, they become functionally opaque. Weights shift… but WHY. What does that vector MEAN.

                  True, but I guess my point is that a lot of people ascribe, as you pointed out, way more “spirit” or “humanity” to what an LLM is, whereas in reality it’s actually a pretty simple lil box. Numbers go in, numbers come out, and all it does is guess what the next number is gonna be. Numbers go BRRRRRRRRR
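The “numbers go in, numbers come out” picture can be sketched as a toy next-token predictor. Everything here is a placeholder for illustration: the vocabulary, the single weight matrix `W` (random, untrained), and the function name are all made up; a real LLM stacks many layers of learned weights, but the shape of the computation — token ids in, a probability distribution over the next token out — is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary; real models use tens of thousands of tokens.
VOCAB = ["I", "love", "wrenches", "they", "are", "handy"]

# Stand-in for trained weights: one random matrix instead of many layers.
W = rng.normal(size=(len(VOCAB), len(VOCAB)))

def next_token_distribution(token_ids):
    # Numbers in: a list of token ids. One matrix lookup + sum here,
    # where a real model would apply many transformer layers.
    logits = W[token_ids].sum(axis=0)
    # Softmax: numbers out, as a probability distribution over the vocab.
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()

probs = next_token_distribution([0, 1])   # ids for "I love"
print(VOCAB[int(probs.argmax())])         # most likely next token (per random W)
```

Since `W` is random rather than trained, the “prediction” is meaningless — which is rather the point: the mechanism itself is just arithmetic on numbers, and everything interesting lives in what the trained weights happen to be.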

                  I think where I fundamentally disagree is that “they do what they say they do” by any definition beyond the simple tautology that everything is what it is.

                  I guess I was referring to how there’s a lot of tools out there that are built to do stuff other than what they oughta do.

                  Like stick a flashlight onto a wrench, if you will. Now it’s not just a wrench, now it’s a flashlight too.

                  But an LLM is… pretty much just what it is, though some people are now trying pretty hard to make it be more than that (and not by adding layers on top; I’m talking about training LLMs to be more than LLMs, which I think is a huge waste of time)