• Fizz@lemmy.nz · 1 year ago

    And what? Half the shit on Google is completely wrong as well.

    • noodlejetski@kbin.social · 1 year ago

      Google actually pulls results from web pages.

      You know how some smartphone keyboards predict the next word you're going to type, and by tapping the suggestion bar over and over you can form a comprehensible sentence that sometimes even makes sense? That's what those language models do. They don't actually search for anything; they just generate sequences of words that sound probable.
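The keyboard analogy above can be sketched in a few lines. This is a toy bigram predictor, not a real language model: it just counts which word tends to follow which, then repeatedly "taps the top suggestion" (the corpus and function names are made up for illustration).

```python
from collections import Counter, defaultdict

# Toy corpus; a real model is trained on vastly more text.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count which word follows which (a bigram table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_sentence(start, length=5):
    """Build a sentence by always taking the most frequent next word,
    like tapping the first suggestion on a keyboard's prediction bar."""
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:
            break
        words.append(options.most_common(1)[0][0])
    return " ".join(words)

print(predict_sentence("the"))  # → "the cat sat on the cat"
```

The output is grammatical-looking but unmoored from any fact, which is the commenter's point: the model optimizes "sounds probable", not "is true".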

      • Fizz@lemmy.nz · 1 year ago

        It seems that the Bing chatbot searches, then reads the results and gives you the answer.

        I know it’s basically predictive text, but if the prompt contains the relevant info, then the predicted text is likely to be the answer you’re looking for, so it works well.
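The "search first, then predict" idea described here is often called retrieval-augmented generation. A minimal sketch of the pattern, where `web_search` and `llm_complete` are hypothetical stand-ins (not real APIs), just to show how retrieved text ends up inside the prompt:

```python
def web_search(query):
    # Hypothetical placeholder: a real system would query a search engine.
    return ["Snippet 1 about the topic.", "Snippet 2 about the topic."]

def llm_complete(prompt):
    # Hypothetical placeholder: a real system would call a language model.
    return "(answer conditioned on the snippets above)"

def build_prompt(question, snippets):
    # The key trick: stuff retrieved snippets into the prompt, so the
    # "likely next words" are grounded in what the search returned.
    return "Context:\n" + "\n".join(snippets) + f"\n\nQuestion: {question}\nAnswer:"

def answer_with_search(question):
    snippets = web_search(question)
    return llm_complete(build_prompt(question, snippets))

print(answer_with_search("example question"))
```

The model is still doing next-word prediction; the retrieval step just biases that prediction toward the fetched text.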

    • Bloonface@kbin.social · 1 year ago

      Yeah, but with search results you can tell from the context that they’re just a list of random web pages, and that what Google surfaces may well be bollocks.

      Google gives you a bunch of results and says “here, look at these”. LLMs confidently tell you things that they may have simply made up and present them as if they’re real.