• iridaniotter [she/her]@hexbear.net · 11 months ago

    Algorithms are racist because they're trained on racist data, and that data comes from a racist society. It's more likely that it regurgitates Israeli propaganda because the Internet is full of it (and there was far less pushback back when this LLM was trained) than because the company made sure to dial up the racism meter.

    • happybadger [he/him]@hexbear.netOP · 11 months ago (edited)

      Why not both? Cops are racist because they enforce racist laws for a racist society, but there are still individual racists deciding what those racist laws are and how they're enforced. There are already filters in place to make sure the data sets it's trained on aren't random gibberish and that it doesn't generate illegal/unethical responses. Those are intentional, individual choices by him and his team.

    • happybadger [he/him]@hexbear.netOP · 11 months ago

      Of course there will be consequences! Underpaid, overworked teachers eventually using these LLMs to make lesson plans for their students. Underfunded school systems eventually replacing teachers with them. The responses becoming the foundation of every lazy internet reply, half-assed search, and autogenerated script. A society of medieval peasants asking the magic computer for its divine wisdom and never questioning its logic, like a Skinner-box version of how Facebook has eaten the minds of most people over age 40.