Meta’s AI image generator is coming under fire for its apparent struggles to create images of couples or friends from different racial backgrounds.

  • TheChurn@kbin.social · 7 months ago

    “AI” isn’t needed to solve optimization problems; that’s what we have optimization algorithms for.

    Define an objective and some parameters, hand the problem to any one of the dozens of general-purpose solvers, and you’ll get approximate answers. Large cities already use models like these for traffic flow; there’s a whole field of literature on it.
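    As a toy sketch of that workflow (the two-signal delay formula and every constant here are invented for illustration, not a real traffic model): define the objective, then hand it to a general-purpose solver, here a plain random search standing in for any off-the-shelf optimizer.

```python
import random

# Toy objective: total delay at two intersections as a function of their
# green-split fractions (0..1). The quadratic form is invented for
# illustration; a real model would come from a traffic simulation.
def total_delay(splits):
    g1, g2 = splits
    return (g1 - 0.6) ** 2 + (g2 - 0.4) ** 2 + 0.5 * g1 * g2

# Stand-in for a general-purpose solver: random search over the unit box.
def solve(objective, dim, iters=10_000, seed=0):
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(iters):
        x = [rng.random() for _ in range(dim)]
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

best_splits, best_delay = solve(total_delay, dim=2)
print(best_splits, best_delay)
```

    Any real solver (gradient-based, simplex, simulated annealing, …) slots into the same shape: a black-box objective in, approximate argmin out.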

    The one closest to what you mentioned is a genetic algorithm, again a decades-old technique that has very little in common with generative “AI”.
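    For concreteness, here is a bare-bones genetic algorithm on a toy bit-string objective (the population size, mutation rate, and target pattern are all arbitrary illustrative choices): score a population, keep the fittest half, and refill with mutated crossovers.

```python
import random

rng = random.Random(42)

# Toy fitness: how closely a bit string matches a hidden target pattern.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1):
    # Flip each bit independently with probability `rate`.
    return [1 - g if rng.random() < rate else g for g in genome]

def crossover(a, b):
    # Single-point crossover.
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:]

def genetic_search(pop_size=30, generations=100):
    pop = [[rng.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # selection: keep the fittest half
        children = [
            mutate(crossover(rng.choice(parents), rng.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_search()
print(best, fitness(best))
```

    Selection, crossover, mutation: the whole recipe dates back to the 1970s and involves no neural networks and no training data.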

    • Chozo@fedia.io · 7 months ago

      The problem comes from not knowing beforehand which algorithm is optimal for the particular situation, though. You can only formulate a model based on factors you already know. That’s why AI can train itself and develop its own, new algorithms, and can determine those unknown factors that may have gone unnoticed by humans.

      • TheChurn@kbin.social · 7 months ago

        No, that’s not a real problem either. Model search techniques are very mature; the first automated tools for this were released in the 90s, and they’ve only gotten better.

        AI can’t “train itself”; there is no training involved in solving an optimization problem. A system that queries the value of the objective function (“how good is this solution?”), then tweaks parameters (say, traffic light timings) according to the optimization algorithm and queries the objective function again isn’t training itself and isn’t learning. It is centuries-old mathematics.
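        That query-and-tweak loop fits in a few lines (the quadratic delay function and all the timings below are invented stand-ins for a real traffic model):

```python
import random

rng = random.Random(7)

# Invented stand-in for "how good is this solution": delay at one
# intersection as a function of its green time in seconds, minimized
# near 45 s. A real objective would come from traffic measurements.
def delay(green_seconds):
    return (green_seconds - 45.0) ** 2

# The loop from the comment: query the objective, tweak the parameter,
# query again, keep the tweak if it helped. No "training" anywhere.
green = 20.0
current = delay(green)
for _ in range(500):
    candidate = green + rng.uniform(-2.0, 2.0)  # tweak the timing
    score = delay(candidate)                    # query the objective
    if score < current:                         # keep improvements
        green, current = candidate, score

print(round(green, 1))
```

        This is plain hill climbing; nothing in the loop remembers past data or builds a model, it just walks downhill on the objective.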

        There’s a lot of intentional and unintentional misinformation around what “AI” is, what it can do, and what it can do that is actually novel. Beyond generative AI, the new craze, most of what is packaged as “AI” is a mature algorithm applied to an old problem in a stagnant field, then repackaged as a corporate press release.

        Take drug discovery. No, “AI” didn’t just make 50 new antibiotics; they just hired a chemist who graduated in the last decade, who understands commercial retrosynthetic search tools, and who asked the biopharma people what functional groups they think would work.