‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • Grangle1@lemm.ee (+39/−4) · 1 year ago

    Regardless of feelings on that subject, there’s also the creep factor of people making these without the subjects’ knowledge or consent, which is bad enough, but then these could be used in many other harmful ways beyond one’s own… gratification. Any damage “revenge porn” can do, which I would guess most people would say is wrong, this can do as well.

    • ByteJunk@lemmy.world (+6/−2) · 11 months ago

      I don’t think they’re really comparable?

      These AI pictures are “make believe”. They’re just a guess at what someone might look like nude, based on what human bodies look like. While they apparently look realistic, it’s still a “generic” nude, kind of like how someone would fantasize about a person they’re attracted to.

      Of course it’s creepy, and sharing them is clearly unacceptable as it’s certainly bullying and harassment. These AI nudes say more about those who share them than they do about who’s portrayed in them.

      However, sharing intimate videos without consent, and especially as revenge? That’s a whole other level of fucked up. The AI nudes are ultimately “lies” about someone; they’re fakes. Sharing an intimate video betrays someone’s trust. It exposes something that is private but very real.