• Cethin@lemmy.zip

    I would describe it more as giving us the results we asked for rather than doing what we tell it to, but that's probably splitting semantic hairs. We mostly don't tell it what to do; we just give it data with some labels, and it tries to work out reasons for those labels. It's essentially the human problem of "correlation does not equal causation," except with no awareness of the issue and significantly worse. A rough sketch of what I mean is below.
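
    To make that concrete, here's a toy sketch (my own made-up setup, not anything from a real system): a logistic regression gets two features, one that genuinely drives the label and one that merely correlates with it during training. The model happily leans on the shortcut, and its accuracy collapses once that correlation stops holding.

    ```python
    # Toy example: a model learning a correlation instead of the cause.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000

    def make_data(spurious_agreement):
        """Labels depend on a 'causal' feature; a second feature merely
        agrees with the label at the given rate."""
        y = rng.integers(0, 2, n)
        causal = y + rng.normal(0, 1.0, n)            # noisy but genuinely related
        agrees = rng.random(n) < spurious_agreement   # matches y this often
        spurious = np.where(agrees, y, 1 - y) + rng.normal(0, 0.1, n)
        return np.column_stack([causal, spurious]), y

    # During training the shortcut feature agrees with the label 95% of the time...
    X_train, y_train = make_data(0.95)
    # ...but at test time that correlation is gone (50% = pure chance).
    X_test, y_test = make_data(0.50)

    clf = LogisticRegression().fit(X_train, y_train)
    print("weights [causal, spurious]:", clf.coef_[0])
    print("train accuracy:", clf.score(X_train, y_train))
    print("test accuracy: ", clf.score(X_test, y_test))
    # The spurious feature gets the bigger weight, so accuracy drops hard once
    # the correlation breaks -- the model never knew which feature was the
    # actual reason for the label.
    ```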