After several months of reflection, I’ve come to only one conclusion: a cryptographically secure, decentralized ledger is the only solution to making AI safer.

Quelle surprise.

There also needs to be an incentive to contribute training data. People should be rewarded when they choose to contribute their data (DeSo is doing this) and even more so for labeling their data.

Get pennies for enabling the systems that will put you out of work. Sounds like a great deal!

All of this may sound a little ridiculous, but it’s not. In fact, the former CTO of OpenSea has already begun the work.

I dunno, that does make it sound ridiculous.

  • ABoxOfNeurons@lemmy.one · 1 year ago

    After going in suspicious, this actually sounds like a pretty decent idea.

    The technology isn't stopping or going away any more than the cotton gin did. May as well put control in as many hands as possible. The alternative is putting it under the sole control of a few megacorps, which seems worse. Is there another option I'm not seeing?

    • Windex007@lemmy.world · 1 year ago

      The guy is the epitome of a "bro". He has a vague understanding of a philosophical problem, boils it down into a false dichotomy, and then invents in his head a solution that simultaneously demonstrates he has no comprehension whatsoever of LLMs and only a basic working model of the blockchain.

    • froztbyte@awful.systems · 1 year ago

      Yeah, seriously, you should look into VC work as your next career. You have that rare blend of wilful blindness and optimistic gullibility that seem to be job requirements there; you’ll do great!

    • raktheundead@fedia.io · 1 year ago

      Not using blockchains, for a start. Blockchains centralise by design, because of economies of scale.

    • gerikson@awful.systems (OP) · 1 year ago

      Everyone puts legal limits in place to prevent LLMs from ingesting their content; the ones who break those limits are prosecuted to the full extent of the law; the whole thing collapses in a downward spiral; and everyone pretends it never happened.

      Well, a man can dream.