• hexaflexagonbear [he/him]@hexbear.net (OP) · 33 · 6 months ago
    The heatmap on the right in the image shows the error, which gets progressively worse as the numbers get larger. Notably, the error is also not symmetric in the operands, so the model has not learned that addition is commutative. Even after roughly 2^128 training examples (it seems the training set is every pair of unsigned 64-bit integers), it couldn’t figure out that a + b = b + a.
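    The asymmetry is easy to quantify: for any claimed adder f, compare f(a, b) against f(b, a) over random operand pairs. A minimal sketch, using a made-up lossy “adder” as a stand-in (the actual model and its outputs aren’t available here, so `toy_adder` is purely illustrative):

    ```python
    import random

    def toy_adder(a: int, b: int) -> int:
        # Hypothetical stand-in for a learned adder: deliberately lossy
        # and asymmetric. It drops low-order bits of the *second* operand
        # only, so error grows with operand size and f(a,b) != f(b,a).
        k = max(b.bit_length() - 12, 0)   # keep ~12 significant bits of b
        return a + (b >> k << k)

    def commutativity_gap(f, trials=10_000, bits=64, seed=0):
        # Mean |f(a,b) - f(b,a)| over random operand pairs: this is zero
        # for any adder that has actually learned that addition commutes.
        rng = random.Random(seed)
        total = 0
        for _ in range(trials):
            a = rng.getrandbits(bits)
            b = rng.getrandbits(bits)
            total += abs(f(a, b) - f(b, a))
        return total / trials

    print(commutativity_gap(lambda a, b: a + b))  # exact addition: 0.0
    print(commutativity_gap(toy_adder) > 0)       # lossy toy adder: True
    ```

    The same scan over operand magnitudes would reproduce the heatmap: bucket the pairs by size and plot the mean error per bucket.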

    • soiejo [he/him,any]@hexbear.net · 10 · 6 months ago

      TBH I wouldn’t expect an ML algorithm to “figure out” that addition is commutative, even a good one with acceptable error (unlike this one); it’s a big logical leap that it isn’t really suited to make by itself (ofc this just means it’s a silly way to try to do addition on a computer)

        • silent_water [she/her]@hexbear.net · 5 · 6 months ago

          fwiw, commutativity didn’t really get called out specifically by mathematicians until they adopted some kind of symbolic representation (which happened at vastly different times in different places). without algebra, there’s not much reason to spell it out, even if you happen to notice the pattern, and it’s even harder to prove. (actually… it’s absurdly hard to prove even with it - see the Principia Mathematica…)
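          For a modern sense of what that proof involves: even in a proof assistant, commutativity of addition on the naturals has to be established by induction rather than taken as obvious. A sketch in Lean 4 (leaning on the library lemmas `Nat.add_succ` and `Nat.succ_add` for the rewrite steps):

          ```lean
          -- Commutativity of ℕ-addition, proved by induction on the
          -- second operand: the "obvious" fact still needs real work.
          theorem my_add_comm (a b : Nat) : a + b = b + a := by
            induction b with
            | zero => rw [Nat.add_zero, Nat.zero_add]
            | succ n ih => rw [Nat.add_succ, ih, Nat.succ_add]
          ```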

          these algorithms are clearly not reasoning, but this isn’t an example of that. yes, commutativity seems obvious and simple now, but that shortchanges how huge a shift the switch to symbolic reasoning was in the first place. and that’s setting aside whether notions like “memory” and “attention” are things these algorithms can actually have (don’t get me started on how obtuse the literature is on this point).