cron@feddit.org to Technology@lemmy.world • OpenAI, Google, Anthropic admit they can’t scale up their chatbots any further
It’s absurd that some of the larger LLMs now use hundreds of billions of parameters (e.g. Llama 3.1 with 405B).
This doesn’t seem like a smart use of resources if you need several of the largest GPUs available just to run a single conversation.
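A quick back-of-the-envelope sketch of why (my own assumptions, not from the article: bf16 weights at 2 bytes per parameter and roughly 80 GB of usable VRAM per high-end GPU):

```python
# Rough memory estimate for serving a 405B-parameter model.
# Assumptions: bf16/fp16 weights (2 bytes per parameter) and ~80 GB of
# VRAM per high-end GPU (H100-class). Activations and KV cache not counted.

PARAMS = 405e9          # parameter count (Llama 3.1 405B)
BYTES_PER_PARAM = 2     # bf16/fp16 precision
GPU_VRAM_GB = 80        # assumed per-GPU memory

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
gpus_needed = weights_gb / GPU_VRAM_GB

print(f"Weights alone: ~{weights_gb:.0f} GB")          # ~810 GB
print(f"GPUs just to hold the weights: ~{gpus_needed:.1f}")  # ~10.1
```

So even before the KV cache and activations, the weights alone fill on the order of ten such GPUs.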
I don’t think your brain can reasonably be compared with an LLM, any more than it can be compared with a calculator.