meyotch@slrpnk.net to Technology@lemmy.world • Gemini AI tells the user to die — the answer appeared out of nowhere when the user asked Google's Gemini for help with his homework
7 days ago

I suspect it may be due to a similar habit I have when chatting with a corporate AI. I will intentionally salt my inputs with random profanity or non sequitur info, partly for lulz, but also to poison those pieces of shit's training data.
SafeRent was a giant piece of shit before "AI". Fifteen years ago I tried to rent a place whose landlord used them. The report returned several serious felonies, committed over a span of years by another person with an only vaguely similar name who lived in a state I had never even visited.
The leasing office staff admitted that the report was transparently bogus, but they still had orders to deny housing to anyone with a negative report.
My only recourse at the time was to lock my record so they couldn't issue reports in my name at all. I now ask right up front which screening company a landlord uses, and they get a vigorous 'nope' if it's SafeRent.
Fsck SafeRent!