• pixxelkick@lemmy.world
    16 days ago

    “Once a model is trained, they become functionally opaque. Weights shift… but WHY. What does that vector MEAN.”

    True, but I guess my point is that a lot of people ascribe, as you pointed out, way more “spirit” or “humanity” to an LLM than is really there, when in reality it’s actually a pretty simple lil box. Numbers go in, numbers come out, and all it does is guess what the next number is gonna be. Numbers go BRRRRRRRRR
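    Just to spell out the “numbers go in, numbers come out” bit, here’s a toy sketch of the whole loop. The `model` function is a made-up stand-in for whatever trained network you’ve got (it takes a list of token ids and returns one score per vocabulary entry); nothing here is any real library’s API.

    ```python
    # Toy sketch of "numbers go in, numbers come out".
    # `model` is a hypothetical stand-in for a trained LLM: it takes a list of
    # token ids and returns one raw score (logit) per token in the vocabulary.

    import math
    import random

    def sample_next(logits, temperature=1.0):
        # Softmax the raw scores into probabilities, then draw one token id.
        scaled = [x / temperature for x in logits]
        m = max(scaled)
        exps = [math.exp(x - m) for x in scaled]
        total = sum(exps)
        weights = [e / total for e in exps]
        return random.choices(range(len(weights)), weights=weights, k=1)[0]

    def generate(model, prompt_ids, n_tokens):
        ids = list(prompt_ids)
        for _ in range(n_tokens):
            logits = model(ids)              # numbers go in
            ids.append(sample_next(logits))  # a guess at the next number comes out
        return ids
    ```

    That’s the whole trick: the loop just keeps asking “what number comes next” and feeding the answer back in.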

    “I think where I fundamentally disagree is that ‘they do what they say they do’ by any definition beyond the simple tautology that everything is what it is.”

    I guess I was referring to how there’s a lot of tools out there that are built to do stuff other than what they oughta do.

    Like stick a flashlight onto a wrench, if you will. Now it’s not just a wrench, it’s a flashlight too.

    But an LLM is… pretty much just what it is, though some people now are trying pretty hard to make it be more than that (and not by adding layers on top, I’m talking about training LLMs to be more than LLMs, which I think is a huge waste of time).