Google’s AI-driven Search Generative Experience has been generating results that are downright weird and evil, e.g. touting slavery’s positives.

  • scarabic@lemmy.world
    1 year ago

    If it’s only as good as the data it’s trained on, garbage in / garbage out, then in my opinion it’s “machine learning,” not “artificial intelligence.”

    Intelligence has to include some critical, discriminating faculty. Not just pattern-matching vomit.

    • samus12345@lemmy.world
      1 year ago

      We don’t yet have the technology to create actual artificial intelligence. It’s an annoyingly pervasive misnomer.

      • Flying Squid@lemmy.world
        1 year ago

        And the media isn’t helping. The title of the article is “Google’s Search AI Says Slavery Was Good, Actually.” It should be “Google’s Search LLM Says Slavery Was Good, Actually.”

    • profdc9@lemmy.world
      1 year ago

      Unfortunately, people who grow up in racist groups also tend to be racist. Slavery used to be considered normal and justified for various reasons. For many, killing someone whose religion or beliefs differ from theirs is OK. I am not advocating for moral relativism, just pointing out that a computer learns what is or is not moral in the same way that humans do: from other humans.

      • scarabic@lemmy.world
        1 year ago

        You make a good point. Though humans at least sometimes do some critical thinking between absorbing something and acting on it.