• chaosCruiser@futurology.today
    11 days ago

"However, if it is performance you are concerned about, 'it's important to note that GPUs still far outperform NPUs in terms of raw performance,' Jessop said, while NPUs are more power-efficient and better suited for running perpetually."

    Ok, so if you want to run your local LLM on your desktop, use your GPU. If you're doing that on a laptop in a cafe, get a laptop with an NPU. If you don't care about either, you don't need to think about these AI PCs.

      • chaosCruiser@futurology.today
        9 days ago

        It's a power efficiency thing. According to the article, a GPU gets the job done, but uses more energy to get there. That's probably not a big deal unless charging opportunities are scarce.