Probably the skepticism is around someone actually trusting the LLM this hard rather than the LLM doing it this badly. To that I will add that based on my experience with LLM enthusiasts, I believe that too.
I have talked to multiple people who recognize the hallucination problem but think they have solved it because they are good “prompt engineers”. They always include a sentence like “Do not hallucinate” and think that works.
The gaslighting from the LLM companies is really bad.
“Prompt engineering” is the astrology of the LLM world.
There are ways to get more relevant info (when using terms that have different meanings depending on context), to reduce the needless ass kissing, and to help ensure you get responses in formats more useful to you. But being able to provide it context is not some magic fix for the underlying problems of how this tech is constructed and its limitations. It will never be trustworthy.
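To be concrete about what those "ways" actually are: they boil down to prompt construction, nothing deeper. A minimal sketch (the helper name and message shape are illustrative, mirroring common chat-completion APIs, not any specific vendor's SDK) of pinning down an ambiguous term and requesting a machine-readable format — note that none of this changes the model's underlying tendency to fabricate:

```python
def build_prompt(question: str, domain: str) -> list[dict]:
    """Hypothetical helper: build a chat prompt that disambiguates
    terminology and constrains the output format. This is all
    'prompt engineering' really is -- string assembly."""
    system = (
        f"Interpret all terms in the context of {domain}. "
        'Answer strictly as JSON: {"answer": "...", "confidence": "low|medium|high"}. '
        "No prose outside the JSON object."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

messages = build_prompt("What does 'driver' mean here?", "operating systems")
# The system message carries the context and format constraints;
# the model may still comply with the format while fabricating the content.
```

The format constraint makes the output easier to parse, but a confidently wrong answer in valid JSON is still a confidently wrong answer.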
Edit: God forbid anyone want our criticism to be based on an understanding of this shit rather than pure vitriol and hot takes.