These scammers, who use Mr Beast's popularity, generosity, and (mostly) deep-fake AI to trick people into downloading malware, somehow do not violate Instagram's community guidelines.

After trying to submit a request to review these denied claims, it appears I have been shadow-banned in some way or another, as only an error message pops up.

Instagram is allowing these to run on its platform. Intentional or not, this is ridiculous, and Instagram should be held accountable for allowing malicious websites to advertise their scams to its users.

For a platform of this scale, this is completely unacceptable. The ads are blatant, and I have no idea how Instagram's report bots/staff are missing them.

  • EnderMB@lemmy.world · 1 year ago

    Like many, I’ve reported lots of stuff to basically every social media outlet, and nothing has been done. Most surprising, a woman I know was being harassed by people setting up fake accounts impersonating her. Meta did nothing, so she went to the police…who also did nothing. Her MP eventually got involved, and after three months the accounts were removed, but the damage had gone on for about two years at that point.

    As someone who works in tech, it’s obvious why this is such a hard problem: it requires actual people to review the content, get the context, and resolve reports in a timely and efficient manner. That’s not scalable on a platform with millions of posts a day, because it takes thousands (if not more) of people to triage, action, and build on this. That costs a ton of money, and tech companies have been trying (and failing) to solve this problem at scale for decades. I maintain that if someone is able to reliably solve it (in a way users are happy with), they’ll make billions.

    • jjjalljs@ttrpg.network · 1 year ago

      I’m going to argue that if they can’t scale to millions of users safely they shouldn’t.

      If they were selling food at huge scale but “couldn’t afford to have quality checks on all of what they ship out,” most people probably wouldn’t be like, “Yeah, that’s fine. I mean, sometimes you get a whole rat in your Cap’n Crunch, but they have to make a profit.”

      Also I’m pretty sure a billionaire could afford to pay a whole army of moderators.

      On the other hand, as someone else said, they kind of go to bat for awful people more often than not. I don’t really want to see that behavior scaled up.

      • EnderMB@lemmy.world · 1 year ago

        You’re probably right, but as a thought exercise, imagine how many people you would need to hire across multiple regions, and what sort of salary those people would deserve, given the responsibility. That’s why these companies don’t want to pay for it, and anyone who has worked this kind of data-entry job will know that it can be brutal.

        IMO, governments should enforce it, but that requires a coordinated effort across multiple governments.

    • Patches@sh.itjust.works · 1 year ago

      But it is scalable. Do you have any idea how much fuckin money these social media sites make? They absolutely can afford it. We just don’t force them to.

    • Rodeo@lemmy.ca · 1 year ago

      “That costs a ton of money”

      As if they don’t have it?

      Fuckin please. I’m so sick of hearing that something is “too expensive” for a multi-billion-dollar, multinational corporation.

    • Facebones@reddthat.com · 1 year ago

      I get a TOS flag any time I call out someone using their faith to justify bigotry and violence, though, so we know there’s at least one group FB goes to bat for: Christofascists.