• CosmicTurtle@lemmy.world · 9 months ago

    I think I used to do something similar with email spam traps. Not sure if it’s still around, but basically you could help build NaCL lists by posting an email address on your website that was visible in the source code but not to normal users, like in a div positioned way off the left side of the screen.

    Anyway, spammers that do regular expression searches for email addresses would email it and get their IPs added to naughty lists.
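To make the mechanics concrete, here's a rough sketch of the harvesting side in Python. The page, the trap address, and the regex are all made up for illustration; the point is that a naive regex sweep over raw HTML pulls out an address no human visitor ever sees.

```python
import re

# Page source: the trap address sits in a div pushed far off-screen,
# invisible to normal visitors but present in the HTML a scraper reads.
page = """
<html><body>
  <p>Contact us at sales@example.com</p>
  <div style="position:absolute; left:-9999px">
    trap-7f3a@example.com
  </div>
</body></html>
"""

# The kind of naive regex sweep a harvester runs over raw HTML.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

harvested = EMAIL_RE.findall(page)
print(harvested)  # ['sales@example.com', 'trap-7f3a@example.com']

# Any mail later sent to the trap address proves the sender scraped
# the page; its source IP goes on the naughty list.
```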

    I’d love to see something similar with robots.
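A robots version could work the same way: list a trap path in robots.txt, link to it only via invisible markup, and flag any client that fetches it anyway. A minimal sketch, with a hypothetical path and hand-rolled request handling standing in for a real server:

```python
# Trap path: forbidden in robots.txt and linked from pages only via
# invisible markup, so no human and no well-behaved bot ever hits it.
TRAP = "/private/do-not-crawl/"  # hypothetical path
ROBOTS_TXT = f"User-agent: *\nDisallow: {TRAP}\n"

naughty_ips = set()  # candidate shareable blocklist


def handle_request(path: str, client_ip: str) -> int:
    """Return an HTTP status; record clients that ignore robots.txt."""
    if path.startswith(TRAP):
        # Only a crawler that ignored (or never read) robots.txt
        # lands here -- flag the IP.
        naughty_ips.add(client_ip)
        return 403
    return 200


print(handle_request("/index.html", "198.51.100.7"))  # 200
print(handle_request(TRAP + "page1", "203.0.113.42"))  # 403
print(sorted(naughty_ips))  # ['203.0.113.42']
```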

  • Lvxferre@mander.xyz · 9 months ago

      Yup, it’s the same approach as email spam traps, minus the naughty list so far. But… holy fuck, a shareable bot IP list would be an amazing addition; it would increase the damage to those web-crawling businesses.
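The sharing part is roughly what DNSBLs already do for mail. A toy sketch of the consuming side, with made-up inline feeds standing in for real published lists:

```python
import ipaddress

# Hypothetical: each participating site publishes its trap-caught IPs
# as a plain-text list; everyone merges everyone else's.
FEEDS = {
    "site-a": "203.0.113.42\n203.0.113.7\n",
    "site-b": "198.51.100.23\n203.0.113.42\n",
}


def merge_feeds(feeds: dict) -> set:
    """Union of all published trap lists, with junk lines dropped."""
    merged = set()
    for body in feeds.values():
        for line in body.splitlines():
            line = line.strip()
            try:
                ipaddress.ip_address(line)  # validate before trusting
            except ValueError:
                continue
            merged.add(line)
    return merged


blocklist = merge_feeds(FEEDS)
print(len(blocklist))  # 3 -- the duplicate collapses
```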

    • Nighed@sffa.community · 9 months ago

        But with all of the cloud resources available now, you can cycle through IP addresses without any trouble. Hell, you could just crawl from IPv6 addresses and not even worry, given how cheap those are!
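That's the crux: a single cheap rented /64 holds 2^64 addresses, so blocking individual IPv6 addresses is hopeless; the useful granularity is the prefix. A sketch using Python's stdlib ipaddress module, with documentation-range addresses as stand-ins:

```python
import ipaddress

# Two "different" crawler IPs that are really the same rented /64.
a = ipaddress.ip_address("2001:db8:1234:5678::1")
b = ipaddress.ip_address("2001:db8:1234:5678:dead:beef:0:2")

# Addresses in one /64: blocking them one by one is hopeless.
print(2 ** 64)  # 18446744073709551616


def block_key(ip):
    """Collapse an IPv6 address to its /64 for blocklist purposes."""
    return ipaddress.ip_network(f"{ip}/64", strict=False)


print(block_key(a) == block_key(b))  # True -- one blocklist entry covers both
```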

      • Lvxferre@mander.xyz · 9 months ago

          Yeah, that throws a monkey wrench into the idea. That’s a shame, because “either respect robots.txt or you’re denied access to a lot of websites!” is appealing.

        • Nighed@sffa.community · 9 months ago

            That’s when Google’s browser DRM thing (Web Environment Integrity) starts sounding like a good idea 😭