A new paper suggests diminishing returns from larger and larger generative AI models. Dr Mike Pound discusses.

The Paper (No “Zero-Shot” Without Exponential Data): https://arxiv.org/abs/2404.04125

  • @Murvel@lemm.ee
    7
    edited
    2 months ago

    What you mentioned is assumed by the video and the paper in question.

    The main argument being that, no matter our computational techniques, diminishing returns in predictive precision are reached far sooner than we achieve general intelligence.

    • @Womble@lemmy.world
      2
      2 months ago

      No, the argument is that current techniques give logarithmic returns in data size, which is bad (a rough sketch of what that means is below). But the paper said nothing about other potential techniques, nor did it suggest that this was a general result.
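      To illustrate what logarithmic returns in data size look like, here is a toy sketch in Python (the curve and its coefficients are invented for illustration, not taken from the paper):

      ```python
      import math

      # Toy scaling curve: zero-shot accuracy grows roughly with log10 of the
      # number of pretraining examples. Coefficients are made up for illustration.
      def toy_accuracy(num_examples: int) -> float:
          return min(1.0, 0.05 + 0.08 * math.log10(num_examples))

      for n in [10**3, 10**5, 10**7, 10**9]:
          print(f"{n:>13,} examples -> accuracy ~ {toy_accuracy(n):.2f}")

      # Every 100x more data buys the same fixed bump in accuracy, which is what
      # "logarithmic returns in data size" means in practice.
      ```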

      • @Murvel@lemm.ee
        3
        2 months ago

        Well, obviously they cannot rule out techniques no one has thought of, but likewise they obviously accounted for what they deemed to be within the realm of possibility.

    • @boyi@lemmy.sdf.org
      2
      edited
      2 months ago

      no matter our computational techniques, diminishing returns in predictive precision are reached far sooner than we achieve general intelligence

      That’s a very bold presumption. How can they be so sure that no future model can tackle the issue? Have they got proof or something?

      • @Murvel@lemm.ee
        2
        2 months ago

        No, they just extrapolate from increasing the size of the training set… it’s not that complicated. Which is a fair presumption, as that is how we’ve increased predictive precision so far. A rough sketch of that kind of extrapolation is below.
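        The sketch below shows what such a calculation could look like (the log-linear curve and its coefficients are hypothetical, not the paper's actual fit): if precision grows with the log of the training set size, inverting the curve shows the data required grows exponentially with the target.

        ```python
        # Hypothetical log-linear scaling fit: accuracy = a + b * log10(N).
        # The values of a and b are illustrative, not taken from the paper.
        a, b = 0.05, 0.08

        def examples_needed(target_accuracy: float) -> float:
            """Invert the toy curve: N = 10 ** ((target - a) / b)."""
            return 10 ** ((target_accuracy - a) / b)

        for target in (0.60, 0.75, 0.90, 0.99):
            print(f"target accuracy {target:.2f} -> ~{examples_needed(target):.1e} examples")

        # Under this toy curve, each additional 0.08 of accuracy costs another
        # 10x data: the "exponential data" problem the paper's title refers to.
        ```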

      • @technocrit@lemmy.dbzer0.com
        1
        edited
        2 months ago

        It seems far more bold to presume that general intelligence will be created any time soon when current machine learning is nowhere close.