Several big businesses have published source code that incorporates a software package previously hallucinated by generative AI.

Not only that, but someone, having spotted this recurring hallucination, turned the made-up dependency into a real one, which was subsequently downloaded and installed thousands of times by developers acting on the AI's bad advice, we've learned. If the package had been laced with actual malware, rather than being a benign test, the results could have been disastrous.
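One practical defence against this class of attack is to check that every declared dependency actually exists in the registry before installing it. The following is a minimal sketch of that check against PyPI's public JSON API; the requirements.txt path and the crude name parsing are assumptions for illustration, not anything from the article:

```python
# Sketch: flag dependencies that PyPI has never heard of, since a
# name an AI hallucinated is a name an attacker could register.
import urllib.error
import urllib.request

def exists_on_pypi(name: str) -> bool:
    """Return True if PyPI knows the package, False on a 404."""
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10):
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise  # other HTTP errors are inconclusive, so surface them

def check_requirements(path: str = "requirements.txt") -> None:
    """Warn about requirement names that do not exist on PyPI."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Crude name extraction: drop version pins like "pkg==1.2".
            name = line.split("==")[0].split(">=")[0].strip()
            if not exists_on_pypi(name):
                print(f"WARNING: {name!r} is not on PyPI -- "
                      "possibly a hallucinated, squat-able dependency")

if __name__ == "__main__":
    check_requirements()
```

A check like this only proves the name exists, not that the package is trustworthy; the attack described above works precisely because the squatted package *does* exist by the time developers install it.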

  • residentmarchant@lemmy.world · 7 months ago

    I find that if I write one or two tests on my own and then tell Copilot to complete the rest, it’s like 90% correct.

    Still not great, but at least it saves me typing a bunch of otherwise-boilerplate unit tests. Roughly, the pattern looks like the sketch below.
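    A toy pytest example of that workflow (the slugify function is made up and inlined here so the sketch actually runs):

    ```python
    # Hypothetical function under test, inlined so the tests run:
    def slugify(text: str) -> str:
        return "-".join(text.strip().lower().split())

    # Hand-written seed tests -- these give Copilot the pattern:
    def test_slugify_lowercases():
        assert slugify("Hello World") == "hello-world"

    def test_slugify_strips_whitespace():
        assert slugify("  hello  ") == "hello"

    # From here down is the kind of boilerplate Copilot will usually
    # complete correctly once it has seen the two examples above:
    def test_slugify_replaces_spaces_with_hyphens():
        assert slugify("a b c") == "a-b-c"

    def test_slugify_empty_string():
        assert slugify("") == ""
    ```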

    • penquin@lemm.ee · 7 months ago

      I actually haven’t tried it this way. I just asked it to write the tests for whatever class I was on, and it started spitting stuff at me. I’ll try your way and see.