• Dizzy Devil Ducky@lemm.ee
    3 months ago

    I’d believe AI will replace human programmers when I can tell it, in a single prompt, to produce the code for an entire video game that stands up to the likes of New Vegas, has zero bugs, and offers hundreds of hours of content on a first playthrough thanks to vast exploration.

    In other words, I doubt we’ll see human programmers going anywhere any time soon.

    Edit:

    Reading other replies reminded me how I once, for fun, tried using a jailbroken Copilot to do Python stuff slightly above my already basic coding skill, and it gave me code that tried importing something that absolutely doesn’t exist. I don’t remember what it was called since I deleted the file while cleaning up my laptop the other day, but I sure as hell looked it up before deleting it and found nothing.
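
    It was roughly this kind of thing (the real module name is long gone, so the one below is a made-up stand-in):

        # Made-up reconstruction, not the actual suggestion - "csv_quickparse"
        # stands in for whatever nonexistent module Copilot invented.
        import csv_quickparse  # fails with ModuleNotFoundError; nothing by that name to install

        rows = csv_quickparse.read("data.csv")  # the generated code used it confidently anyway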

    • SSJMarx@lemm.ee
      3 months ago

      The best Copilot can do is autofill lines that everyone’s written a million times. That’s not nothing, but it ain’t replacing a human brain any time soon.
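
      The sort of line-level autofill I mean, sketched in Python (an illustrative pattern, not an actual Copilot transcript):

          # Classic boilerplate that suggestion engines will finish for you:
          class User:
              def __init__(self, name, email):
                  self.name = name    # typical suggested completion
                  self.email = email  # typical suggested completion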

    • DanForever@lemmy.world
      3 months ago

      Could you imagine Microsoft replacing Windows engineers with a ChatGPT prompt? What would that prompt even look like?

      • nickwitha_k (he/him)@lemmy.sdf.org
        3 months ago

        To be honest, this could be an example of where AI could do marginally better. I don’t mean that because of code quality or functionality. I mean it in the sense of MS software getting absolutely fucked by internal competition and the stack-ranking fostered during the Ballmer years. The APIs are inconsistent, and there is a ton of partially implemented stuff that will never be pushed to completion because everyone who worked on it was fired.

        An AI might be able to implement things without intentionally sabotaging itself, but because LLMs are at the forefront of what would be used and lack any capacity for intention or for understanding context, I’m a bit pessimistic.