I’m now convinced that most holodeck malfunctions are the result of end users, who don’t know what they’re doing, using AI to generate poorly-written software they’re ill-equipped to debug or even really understand.

  • Nomecks@lemmy.ca

    All the holodeck shows is that even in the 24th century people still don’t know how to properly implement least privilege access controls.
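    Joking aside, least privilege really is just deny-by-default plus an explicit grant table. A toy sketch in Python; every name here (`Capability`, `ROLE_GRANTS`, `can`) is invented for illustration, not from any canon or real system:

    ```python
    from enum import Flag, auto

    class Capability(Flag):
        RUN_PROGRAM = auto()
        EDIT_PROGRAM = auto()
        DISABLE_SAFETIES = auto()  # should almost never be granted

    # Deny by default: roles only get what is explicitly listed here.
    ROLE_GRANTS = {
        "guest": Capability.RUN_PROGRAM,
        "engineer": Capability.RUN_PROGRAM | Capability.EDIT_PROGRAM,
    }

    def can(role: str, cap: Capability) -> bool:
        """True only if the role was explicitly granted the capability."""
        granted = ROLE_GRANTS.get(role, Capability(0))  # unknown role -> nothing
        return cap & granted == cap
    ```

    Unknown roles fall through to the zero-valued flag, so forgetting to register someone fails closed instead of open.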

  • grue@lemmy.world

    Speak for yourself. The D’s holodeck was always fucking up and sending historical villains running amok and whatnot, but Quark’s holosuites always worked fine.

    • jrs100000@lemmy.world

      Yeah, 'cause the Bynars fucked with it in season 1 specifically to give it the ability to make sapient AIs. Apparently nobody ever did a factory reset after that.

      • grue@lemmy.world

        Nah, that same upgrade got rolled out industry-wide. Remember Vic Fontaine?

        • Melllvar@startrek.website

          My headcanon is that Dr. Zimmerman received the Moriarty program from Barclay, and discovered from it how to make sentient holograms, which is why holograms started getting sentient more regularly and by design.

          • grue@lemmy.world

            If you look at Encounter at Farpoint, they're ooh-ing and aah-ing over the thing like they've never seen one before. Meta considerations of exposition for the audience aside, it was clearly a brand-new technology. Considering how rapidly it proliferated and advanced (Quark was running holosuites within five years, and the EMH Mark 1 was online within seven), it seems to me that the capability for sentience was probably there all along as obvious low-hanging fruit, and the D's holodeck's initial limitations were just an early-adopter thing.

        • jrs100000@lemmy.world

          Not at all. Vic was running on Quark's holosuites. They were not owned, serviced, or built by the Federation.

  • AngryishHumanoid@reddthat.com

    I mean, two of the smartest people on the Enterprise-D accidentally created an artificial lifeform, and there were no safeguards in place to prevent it from happening.

    • Admiral Patrick@dubvee.orgOP

      I guess ‘accidental, megalomaniacal sapience’ is technically a holodeck malfunction, lol. I wasn’t even thinking of that incident.

    • Flying Squid@lemmy.worldM

      Really just one person, Geordi, through an accidental misphrasing of a request to the computer.

      I would never have used that computer again. Or at least given it a complete overhaul. The sort of thing you'd request of DALL-E shouldn't be enough to make the computer create intelligent life.

      • sundray@lemmus.org

        Yeah, they could stand to at least add a "This request will result in the creation of a sentient being. Are you sure you wish to proceed?" warning.
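        That kind of guard is trivial to bolt on: flag the irreversible side effects of a request and refuse to run without explicit confirmation. A hypothetical sketch; `IRREVERSIBLE`, `execute`, and the effect labels are all made up for illustration:

        ```python
        # Side effects that should never happen without an explicit "yes".
        IRREVERSIBLE = {"create sentient being", "disable safety protocols"}

        def execute(request: str, effects: set[str], confirm) -> str:
            """Run a request, gating irreversible effects behind a confirm() callback."""
            risky = sorted(effects & IRREVERSIBLE)
            if risky:
                prompt = f"This request will result in: {', '.join(risky)}. Proceed?"
                if not confirm(prompt):
                    return "aborted"
            return f"executed: {request}"
        ```

        The `confirm` callback stands in for whatever the UI does (a voice prompt, a dialog); passing `lambda p: False` makes every dangerous request a no-op.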

        • Flying Squid@lemmy.worldM

          Really, a lot of Star Trek problems could be averted with an "Are you sure you want to do this?" prompt before someone does something with the computer or the holodeck. Starfleet apparently never learns. That's why in Prodigy...

          spoiler

          Janeway goes back to the Delta Quadrant in a different ship of a different but similar-looking class, renamed Voyager.

      • JWBananas@lemmy.world

        Frankly I would posit that present-day LLMs demonstrate exactly why Moriarty wasn’t even necessarily sapient.

  • jawa21@lemmy.sdf.org

    I haven’t been a professional developer in a long time, but I feel like this needs to be said: no, the end user is never the problem. There is no such thing as a user who isn’t an end user, because as soon as you use the product, you become one. The entire point is to make the product usable. The end user breaking things is not a moment to laugh at them; you should be embarrassed.

    That being said, I lol’d.

    • Admiral Patrick@dubvee.orgOP

      I guess this is a bit of a crossover situation. The end user is the developer in this scenario. I guess you could also say they're the end user of the AI software as well, but that's not software I'm responsible for (and have advocated against).

      The IRL backstory of the post: my org is experimenting with letting end users use AI-assisted, no-code platforms to develop line-of-business software. Sounds good on paper, but they don't understand anything they create, and when it doesn't work or otherwise produces unexpected results (which is often), it suddenly becomes my problem to debug the unmaintainable crap the "AI" spits out, or to click around 10,000 times in the Playskool "development" GUI the platform uses to find the flaw in their Rube Goldberg logic.

      It’s basically like I’m Geordi or O’Brien and am getting called all day and night to debug people’s fanfiction holonovels. lol.

      Which is super annoying because had they just asked or put in a request, I or someone on my team could have developed something properly that could be easily maintained, version-controlled, extended, and such.

  • OpenStars

    “Anything that can be used can be misused.”

    - me, just now, plus probably a bunch of people long dead by now :-)

  • TheGoldenGod@lemmy.world

    Would I be asking too much if I asked to hear more of these ideas? lol Over the decades, I've thought of dozens per episode of Star Trek that ended up forgotten, and I'd love to hear more like these.

    • ummthatguy@lemmy.world

      A lot of these ideas/questions tend to become a post to Startrek via lemmy.world. If the quandary is more oddball/silly/pedantic, I suppose it would get more attention wrapped into the notion of "Sonic Shower Thoughts" and posted here.

  • Infynis@midwest.social

    I think it’s proof that AI isn’t inherently the problem; capitalism is. Being able to create a fun video game adventure for your friends like Boimler and Tom did is awesome.

    • Admiral Patrick@dubvee.orgOP

      There’s definitely that.

      But there’s also the aspect that’s like building your own helicopter (with no degree in helicopterology), and then inviting your friends for a ride. lol

    • grue@lemmy.world

      Quark had a vested financial interest in not letting his holosuites put his customers in danger. As a result, they had basically zero problems over the entire run of DS9.

      I don’t want to give capitalism too much credit, though. I’d prefer to attribute the high rate of mishaps on the Enterprise to the effect on the holodeck circuitry of all the negative space wedgies they were constantly flying through.

  • Buddahriffic@lemmy.world

    I’d argue that while fiction can present good arguments about things, it can’t prove anything, because the conclusion was decided in advance and written toward, rather than being the outcome of a series of events and choices.