• thisisnotgoingwell@programming.dev · 11 months ago

      I’m guessing he’s saying companies are still using the same human-written code, but since AI is sexy right now and is being used to describe even simple programming logic, everything is “powered by AI”.

      • andrew@lemmy.stuart.fun · 11 months ago

        And in 2013 the keyword for marketing was “algorithm”. The YouTube algorithm. The Reddit algorithm. Etc.

      • fidodo@lemm.ee · 11 months ago

        That was true like 5 years ago, but now companies are instead just irresponsibly calling out to LLMs as a function, without proper safeguards.
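
        A minimal sketch of the difference, in Python. call_llm() here is a hypothetical stand-in rather than any real vendor SDK: the “unsafe” version trusts whatever text comes back, while the guarded version constrains the output to a known set and falls back to a safe default.

        ```python
        import json

        def call_llm(prompt: str) -> str:
            """Hypothetical stand-in for a hosted LLM call."""
            raise NotImplementedError

        # The "LLM as a function" anti-pattern: trust whatever comes back.
        def categorize_unsafe(ticket_text: str) -> str:
            return call_llm(f"Categorize this ticket: {ticket_text}")

        # The same call with basic safeguards: constrain the output,
        # validate it, and fall back to a known-safe default.
        ALLOWED = {"billing", "bug", "feature_request", "other"}

        def categorize_guarded(ticket_text: str) -> str:
            raw = call_llm(
                'Reply with JSON like {"category": "billing"}. '
                f"Categorize this ticket: {ticket_text}"
            )
            try:
                category = json.loads(raw).get("category", "")
            except (json.JSONDecodeError, AttributeError):
                return "other"
            return category if category in ALLOWED else "other"
        ```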

      • kautau@lemmy.world · 11 months ago

        Even more likely is that AIs that write code are trained on human-created code. So in most cases they aren’t coming up with new, novel solutions to problems; they are just a far more advanced “copy and paste from StackOverflow”.

          • kautau@lemmy.world · 11 months ago

            Hey just remember the classic Quora answer:

            https://www.quora.com/Why-should-I-hire-a-software-engineer-if-I-can-just-copy-and-paste-code-from-Stack-Overflow

            They are paying $100,000: $1 to copy and paste code from Stack Overflow, and $99,999 to know where and when to paste it and how to make it work.

            Domain knowledge is real, and AI might level that up, but you’ll be hard-pressed to find a junior engineer who, armed with the same tools as a senior engineer and dropped into a gig, can use AI or even StackOverflow well enough to be on the same playing field. AI can write me a function. But figuring out how broken a legacy codebase is, and how that function can solve an issue, is why engineers are still valuable… for now.

      • r00ty@kbin.life · 11 months ago

        I’ve heard this talk where I work: senior plebs describing things that are obviously just algorithms as AI. And this of course means we had AI before it was cool.

        Nothing new here. Buzzwords are the only thing senior managers can understand.

        • fidodo@lemm.ee · 11 months ago

          Nobody can seem to consistently define what AI even means.

        • theycallmebeez@lemmy.ml · 11 months ago

          ChatGPT is built upon a GPT language model, which is a type of Artificial Intelligence.

          • Xylight (Photon dev)@lemmy.xylight.dev · 11 months ago

            (This isn’t my opinion, just saying what I think they mean.)

            They are saying it’s not intelligent in any way, though. It sees a bunch of words as numbers and spits out new numbers that the prediction algorithm produces.
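
            A toy illustration of that “words as numbers” point (deliberately oversimplified; real models use learned neural networks, not a hand-written lookup table): words become integer IDs, and “prediction” is just picking the highest-scoring next ID.

            ```python
            vocab = {"the": 0, "cat": 1, "sat": 2, "on": 3, "mat": 4}
            id_to_word = {i: w for w, i in vocab.items()}

            # Made-up "next word" scores: scores[a][b] is how strongly
            # token a suggests that token b comes next.
            scores = {
                vocab["the"]: {vocab["cat"]: 0.6, vocab["mat"]: 0.4},
                vocab["cat"]: {vocab["sat"]: 0.9},
                vocab["sat"]: {vocab["on"]: 0.8},
                vocab["on"]: {vocab["the"]: 0.7},
            }

            def predict_next(word: str) -> str:
                token_id = vocab[word]                      # word -> number
                next_id = max(scores[token_id], key=scores[token_id].get)
                return id_to_word[next_id]                  # number -> word

            print(predict_next("the"))   # cat
            print(predict_next("cat"))   # sat
            ```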

            • LoafyLemon@kbin.social · 11 months ago

              What you’re thinking of as AI is actually a narrower version, while true intelligence is termed AGI.

              Explanation:
              The term ‘AI’ (Artificial Intelligence) refers to computer systems that can perform tasks that would typically require human intelligence, like recognizing patterns or making decisions. However, most AI systems are specialized and focused on specific tasks.

              On the other hand, ‘AGI’ (Artificial General Intelligence) refers to a higher level of AI that possesses human-like cognitive abilities. AGI systems would be capable of understanding, learning, and applying knowledge across a wide range of tasks, much like us.

              So, the distinction lies in the breadth of capabilities: AI refers to more specialized, task-focused systems, while AGI represents a more versatile and human-like intelligence.

              • BlinkAndItsGone@lemm.ee · 11 months ago

                The term ‘AI’ (Artificial Intelligence) refers to computer systems that can perform tasks that would typically require human intelligence,

                That’s everything computers do, though, isn’t it? Pocket calculators would have fit this definition of AI in the 1970s. In the '60s, “computer” was a human job title.

            • jadero@programming.dev · 11 months ago

              Fair enough. What evidence have you got that it’s any different than what humans do? Have you looked around? How many people can you point to that are not just regurgitating or iterating or recombining or rearranging or taking the next step?

              As far as I can tell, much of what we call intelligent activity can be performed by computer software and the gaps get smaller every year.

            • Yendor@sh.itjust.works · 11 months ago

              That’s not how ChatGPT works.

              GPT is an LLM that uses an RNN. An RNN (recurrent neural network) is not an algorithm.

                • Yendor@sh.itjust.works · 10 months ago

                  Yeah, but not really. The algorithms are available for free, but they don’t do anything useful by themselves. The RNN is built by training the neural net, which uses grading/classification of the training data to increase or decrease millions of coefficients across a multi-layer filter. It’s the training data, the classification feedback, and the processing power that actually create the AI.
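
                  A minimal sketch of that idea in Python, shrunk from millions of coefficients down to a single one (the numbers are made up purely for illustration): the training data and an error signal repeatedly nudge the coefficient until the model fits.

                  ```python
                  data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # inputs x, targets y (here y = 2x)
                  w = 0.0      # the single "coefficient" being learned
                  lr = 0.05    # how big each nudge is

                  for step in range(200):
                      for x, y in data:
                          prediction = w * x
                          error = prediction - y    # the grading/feedback signal
                          w -= lr * error * x       # nudge the coefficient to reduce the error

                  print(round(w, 3))   # converges toward 2.0
                  ```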