• frog 🐸@beehaw.org · 14 days ago

    I also suspect, based on the accuracy of AIs we have seen so far, that their interpretation of the deceased’s personality would not be very accurate, and would likely hallucinate memories or facts about the person, or make them “say” things they never would have said when they were alive. At best it would be very Uncanny Valley, and at worst would be very, very upsetting for the bereaved person.

    • Zaktor@sopuli.xyz · 14 days ago

      This is a very patronizing view of people who all seem to be well informed about what this is and isn’t, and who have already acknowledged that they will put it aside if it scares them. No one is foisting this on the bereaved wife, and the husband has preemptively said it’s okay if she or her children never use it.

      This might fail in all the ways you think it will. That’s a very small dataset of information, so it’s likely either to be an overcomplicated recording or to need training on more than what he personally said, but it’s not your place to tell her what’s best for her personal grieving process.

      • frog 🐸@beehaw.org · edited · 13 days ago

        Given the husband is likely going to die in a few weeks, and the wife is likely already grieving for the man she is shortly going to lose, I think that still places both of them into the “vulnerable” category, and the owner of this technology approached them while they were in this vulnerable state. So yes, I have concerns, and the fact that the owner is allegedly a friend of the family (which just means they were the first vulnerable couple he had easy access to, in order to experiment on) doesn’t change the fact that there are valid concerns about the exploitation of grief.

        With the way AI techbros have been behaving so far, I’m not willing to give any of them the benefit of the doubt about claims of wanting to help rather than make money - such as using a vulnerable couple to experiment on while making a “proof of concept” that can be used to sell this to other vulnerable people.

        • Zaktor@sopuli.xyz · edited · 13 days ago

          So just more patronizing. It’s their life, you don’t know better than them how to live it, grief or no.

          • frog 🐸@beehaw.org · 13 days ago

            Nope, I’m just not giving the benefit of the doubt to the techbro who responded to a dying man’s farewell posts online with “hey, come use my untested AI tool!”

    • trev likes godzilla@beehaw.org · 14 days ago

      I have no doubts about that either, myself. Though even if such an abomination of a doppelganger were to exist, and it seems that these companies are hellbent on making it so, it would be worse for the reasons you described previously: prolonging and molesting the grieving process that human beings have evolved to go through. All in the name of a dollar. I apologize for being so bitter about this (this bitterness is not directed at you, frog), but this entire “AI” phenomenon fucking disgusts and repulses me so much I want to scream.

      • frog 🐸@beehaw.org · 13 days ago

        I absolutely, 100% agree with you. Nothing I have seen about the development of AI so far has suggested that the vast majority of its uses are anything but grotesque. The few edge cases where it is useful and helpful don’t outweigh the massive harm it’s doing.

    • intensely_human@lemm.ee · 13 days ago

      I think it would be the opposite of upsetting, but in an unhealthy way. I think it would snap them out of their grief into a place of strangeness, and they’d stop feeling their feelings.

      There is no cell of my gut that likes this idea.