oh dear. i thought it was a belt, not the addition of a mid rift top…

why would that happen anyway? seems stupid to me. why ~~mid rift~~ midriff!?!

edit: turns out i was a bit too literal… it is a rift in the middle of her clothes…

  • 𝚝𝚛𝚔 · 47 points · 11 months ago

    Ah yes, the “moar boobs and add a little bit of sexy tummy tee hee” photoshop filter.

      • sneakattack@lemmy.ca · 15 points · 11 months ago

        Which also requires a written prompt in order to generate the altered clothing, after which a few options appear for the user to select and go with. The generated imagery is added as a new layer on top of the base image. It looks like they also added some adjustment layers (for color or contrast, for example). There are so many steps to this where a human has to make a conscious decision to carry on.
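
        A minimal sketch of that chain of decisions, with made-up stand-in names (generate_fill, Layer and friends are hypothetical, not Adobe's actual scripting API):

            from dataclasses import dataclass

            @dataclass
            class Layer:
                name: str
                content: str

            def generate_fill(prompt: str, n_options: int = 3) -> list[str]:
                # Stub for the generative step: returns a few candidate fills to pick from.
                return [f"{prompt} (variation {i + 1})" for i in range(n_options)]

            def build_composite(prompt: str, choice: int) -> list[Layer]:
                options = generate_fill(prompt)              # 1. someone typed a prompt
                chosen = options[choice]                     # 2. someone picked a variation
                layers = [Layer("base image", "original photo"),
                          Layer("generative fill", chosen)]  # 3. added as a new layer on top
                layers.append(Layer("adjustment layer", "color/contrast tweak"))  # 4. adjusted
                return layers                                # 5. then someone exports and publishes

            print(build_composite("altered clothing", choice=0))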

  • mozz@mbin.grits.dev · 33 points · 11 months ago

    They claimed that they outpainted a cropped version of the image, and the image they showed was what the AI came up with and they decided to print it as a real photo.

    I honestly can’t tell whether simply not caring is worse than photoshopping her tits bigger “on purpose.”

    • Deceptichum@kbin.social · 35 points · 11 months ago

      It’s completely bullshit because nothing about her in the photo has been outpainted; it’s purely infilling. The entire midriff has been altered, but it’s the same height.
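
      A rough way to picture the difference, assuming you had both files to compare (the dimensions below are made up): outpainting grows the canvas, while infilling repaints pixels inside the same frame.

          def looks_outpainted(src_size, out_size):
              # Outpainting extends the canvas, so at least one dimension should grow.
              # Infilling/inpainting repaints pixels but keeps the frame the same size.
              return out_size[0] > src_size[0] or out_size[1] > src_size[1]

          original = (1280, 720)   # (width, height) of the source photo -- made-up numbers
          published = (1280, 720)  # the aired image: same width, same height

          print(looks_outpainted(original, published))  # False -> consistent with infilling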

    • Carl@mastodon.nzoss.nz · 5 points · 11 months ago

      @mozz @palitu
      They had ONE job… They’re not even trying to fight it. Losers.

      Also… Who says AI is free of bias?

      Pretty blatant feature selection and enhancement from the schoolboys writing the code.

          • Deceptichum@kbin.social · 4 points · 11 months ago

            You’re both half wrong.

            There is code behind the scenes, but the training data is separate from the code. The training data is going to be scraped from the web, so it’s going to be biased towards whatever is common on the Internet; the ‘schoolboys’ writing the code aren’t exactly responsible for that. Blame the media and what it influences, mainly.
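
            A toy illustration of that separation, using made-up one-line ‘corpora’ rather than a real scrape: the training code below is identical in both runs, so any skew comes entirely from the data it is fed.

                from collections import Counter

                def train(corpus):
                    # The "code": a frequency counter with no opinions of its own.
                    return Counter(word for line in corpus for word in line.split())

                # Made-up stand-ins for web scrapes with different skews.
                corpus_a = ["politician announces policy", "politician gives speech"]
                corpus_b = ["celebrity beach photo", "celebrity bikini photo"]

                print(train(corpus_a).most_common(2))  # the bias follows the data,
                print(train(corpus_b).most_common(2))  # not the training code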

            • Carl@mastodon.nzoss.nz · 2 points · 11 months ago

              @Deceptichum @palitu @mozz @ryannathans
              Soz, I forgot the ‘sarchasm’ tag…

              You know, the gap between someone being sarcastic and the person that doesn’t get it.

              With all that said, I am very disappointed that the sum total of human achievement for the last few millennia has come down to coding and training and recoding and retraining lumps of silicon to deliver half-baked results that *still* have to be verified and are therefore not worth the power it takes to run them…

              Waste. Of. Time.

              • mozz@mbin.grits.dev · 2 points · 11 months ago

                Like a lot of technology it depends on what you do with it. A train can carry your stuff more effectively than a mule. You can use it to carry materials to build a university, or raw ingredients for a new drug that you can manufacture at scale, and that’s probably a good thing. You can use it to carry weapons for a war that doesn’t need to happen, or cattle from an increasingly-industrialized food supply, and that’s a bad thing. You can maintain it poorly and spill toxic chemicals. Up to you.

  • naevaTheRat@lemmy.dbzer0.com · 18 points · 11 months ago

    Honestly we should bring back stocks.

    A day in the stocks for whatever idiot did this, and for the fact-checking/editorial team that reviews their work, would probably help them understand why people might be touchy about how they’re seen in public.

    • palituOP · 7 points · 11 months ago

      add in rotten cabbage and tomatoes…

      • naevaTheRat@lemmy.dbzer0.com · 4 points · 11 months ago

        I’m not into torture, of others at least. I just think that society might benefit from a bit of public humiliation of those who treat others as less than them.

        • palituOP · 2 points · 11 months ago

          if they are rotten, it shouldn’t hurt. i don’t think it is torture, just corporal punishment.

  • RBG@discuss.tchncs.de · 15 points · 11 months ago

    Oh wow, I really didn’t think I’d see an example of “oops, AI did this by mistake, no one else is at fault” so fast after reading this article yesterday. Well, maybe it is the other way round: here they claim the AI did it itself, whereas the article is more about humans having to correct AI to make it look like AI is working perfectly…

    • palituOP · 22 points · 11 months ago

      it sounds very much like a “blame the machine, it was not their fault”… who has this as an AI workflow?

      if man:
          break
      if woman:
          sexify()
      
  • RIPandTERROR@lemmy.blahaj.zone · 14 points · 11 months ago

    “…is common practice, the image was resized to fit our specs. During that process, the automation by Photoshop created an image that was not consistent with the original.”

    Oh God I can’t 😂

  • spiffmeister · 7 points · 11 months ago

    I wonder if the new excuse, any time a media company does something dodgy, will be “oh it was the AI, sorry!”