• Grail@multiverse.soulism.net
    1 day ago

    if they actually could “think” wouldn’t they fact-check themselves first before saying something

    No. They don’t have access to the original training data, or to the internet. They’re stuck remembering it the same way a human remembers something: with neurons. They cannot search the dataset for you. The best they can do is remember and tell you.

    • reksas@sopuli.xyz
      1 day ago

      But they do have access to the internet? At least GPT can search, based on the text it outputs while it’s processing the query.

      • Grail@multiverse.soulism.net
        1 day ago

        Really? Must be a new feature; it didn’t when I tried it. I know they can execute code, so I guess the engineers added a search tool. Regardless, that tool isn’t part of their fundamental design. It’s something they have to go and access, and most of the time they won’t. If you experiment by asking one to write a scientific paper, you’ll find the references are garbage: broken links and nonexistent papers. Hallucinations. It’s just making up something plausible-sounding, the same as a lazy human might.
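        To make the distinction concrete, here’s a minimal sketch of how search gets bolted on as a tool the model must *choose* to call, rather than something built into it. `model_generate` and `web_search` are hypothetical stand-ins for illustration, not any vendor’s real API.

```python
# Hypothetical sketch of a tool-calling loop. The "model" here is a toy
# stand-in: it only decides to search when the prompt obviously asks for
# current information; otherwise it answers from its memorised weights.

def model_generate(prompt):
    # Stand-in for the LLM's single generation step.
    if "latest" in prompt or "today" in prompt:
        return {"action": "tool_call", "tool": "web_search", "query": prompt}
    return {"action": "answer", "text": "answered from model weights"}

def web_search(query):
    # Stand-in for the engineer-supplied search tool.
    return f"search results for: {query!r}"

def respond(prompt):
    """One round of the loop: the model may call a tool, or just answer."""
    step = model_generate(prompt)
    if step["action"] == "tool_call":
        # In a real system the results would be fed back into the model for
        # a second generation pass; here we just return them.
        return web_search(step["query"])
    return step["text"]

print(respond("who wrote hamlet"))              # → answered from model weights
print(respond("latest nvidia driver version"))  # triggers the search tool
```

        The point of the sketch: unless the model’s own output routes through the tool branch, no search happens and the answer comes purely from weights.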

        • reksas@sopuli.xyz
          23 hours ago

          Yeah, I think that’s because it knows what research papers and references should look like, but since it has no reasoning, it will just do whatever. I used GPT to diagnose a problem with my internet getting cut off, and it determined it was because of drivers, which sounds reasonable. Then it suggested that I download the latest ones. It did link to the correct website, but it also tried to download files that don’t exist. No idea how it determined the version numbers and such; maybe based on earlier patterns.

          But it isn’t making stuff up; it’s just outputting the best data it can based on what it has been trained on and what it can find. It’s not laziness, it’s just doing what it does. Just like code that isn’t doing what you want isn’t doing it out of malice, but because there’s a mistake in the code.

          • Grail@multiverse.soulism.net
            15 hours ago

            It doesn’t have access to the training data. It’s not outputting training data; it’s making up something that *feels like* the training data.
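            A toy illustration of what “feels like the training data” means: a tiny bigram model that recombines word transitions learned from a handful of made-up titles. It’s nothing like a transformer internally, but it shows the same failure mode — plausible-sounding output assembled from learned patterns, not retrieved entries.

```python
import random
from collections import defaultdict

# Toy "training data": a few invented paper titles for the example.
corpus = [
    "deep learning for image recognition",
    "attention is all you need",
    "learning deep representations for vision",
]

# Count which word follows which word across the corpus.
transitions = defaultdict(list)
for title in corpus:
    words = title.split()
    for a, b in zip(words, words[1:]):
        transitions[a].append(b)

def generate(start, n_words, rng):
    """Walk the bigram table, producing a plausible-sounding title."""
    out = [start]
    for _ in range(n_words - 1):
        nxt = transitions.get(out[-1])
        if not nxt:
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

rng = random.Random(0)
fake_title = generate("learning", 5, rng)
# The result stitches together fragments of real titles; it may or may not
# match any entry the model was "trained" on — the model has no way to check.
print(fake_title)
```

            Every step of the walk is locally plausible (each word really did follow the previous one somewhere in the corpus), yet the whole string may correspond to nothing that exists — which is exactly what a hallucinated reference is.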