In the last few months, it had become rarer for my model to just make stuff up

But now it searches for almost every query, even when asked not to, and it still makes up nonsense

For instance, I asked it about small details in video games and it told me "the music box stops playing when Sarah dies". There is no music box. This is nonsense.

  • Lembot_0003@lemmy.zip (Banned) · 12 days ago

    Yes, and the system can figure out the correct answer as soon as you point out that the hallucination is wrong. Somehow ChatGPT has become even more unwilling to say "no" or "I don't know" recently.