“Our primary conclusion across all scenarios is that without enough fresh real data in each generation of an autophagous loop, future generative models are doomed to have their quality (precision) or diversity (recall) progressively decrease,” they added. “We term this condition Model Autophagy Disorder (MAD).”

Interestingly, this may become a harder problem as generative AI models see wider use online, since more of the web's content will itself be synthetic.
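The quoted conclusion can be illustrated with a toy experiment. The sketch below is not the paper's setup; it is a minimal one-dimensional stand-in, where "training a model" is just fitting a Gaussian to samples drawn from the previous generation's fit, with no fresh real data injected. Under those assumptions, the fitted spread (a crude proxy for diversity/recall) drifts toward collapse over generations:

```python
import random
import statistics

def autophagous_loop(generations=200, n_samples=20, seed=0):
    """Toy self-consuming loop: each generation is 'trained' (a Gaussian fit)
    purely on samples from the previous generation, with no fresh real data.
    Returns the fitted standard deviation at each generation."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # generation 0: the "real" data distribution
    sigmas = [sigma]
    for _ in range(generations):
        # synthesize data from the current model only
        samples = [rng.gauss(mu, sigma) for _ in range(n_samples)]
        # refit the next generation's model on that synthetic data
        mu = statistics.fmean(samples)
        sigma = statistics.pstdev(samples)
        sigmas.append(sigma)
    return sigmas

sigmas = autophagous_loop()
print(f"fitted sigma: gen 0 = {sigmas[0]:.3f}, gen 200 = {sigmas[-1]:.3g}")
```

The small per-generation estimation error compounds: each refit slightly underestimates the spread on average, so the distribution progressively narrows. Mixing in a fraction of samples from the original generation-0 distribution at each step (the "fresh real data" the authors call for) counteracts the drift.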

  • FaceDeer
    2
    1 year ago

    Humans are not entirely trained on other humans, though. We learn plenty of stuff from our environment and experiences. Note this very important part of the primary conclusion:

    without enough fresh real data in each generation

    • lol3droflxp
      1
      1 year ago

      Math, for example, is something one could argue is taught purely by humans.

      • FaceDeer
        3
        edit-2
        1 year ago

        Dogs can do math and I’m quite sure I’ve never taught my dog that deliberately.

        Even for humans learning it, I would expect that most of our understanding of math comes from everyday usage of it rather than explicit rote training.