It’s built in layers, and the layers that are improving are not the LLMs themselves. It’s the layers that sit between the user and the LLM that are improving, which creates the illusion that the LLMs are improving. They’re not. TropicalDingdong knows what they’re talking about; you should listen to them.
If you keep improving the layers between the LLM and the user long enough, you’ll end up with something we traditionally called a “software program”: a thing optimized for accomplishing a task, with little if any need for an LLM.

While clankers are not completely immune to fire, I’d recommend a pair of extremely high-voltage electrodes instead. It would make a very nice light show, and there would probably still be some fire anyway.