Not just tracking cookies, but browser fingerprinting.

Not just Google, but now Cloudflare.

  • mox@lemmy.sdf.org (OP) · 2 days ago

    CAPTCHAs make web sites awful to use, and waste the limited lifespans of billions of people.

    There are other ways to manage bots.

    • pixxelkick@lemmy.world · 2 days ago

      Not easily, and not at the time, no. Back then, CAPTCHAs really were an easy way to quickly cut down bot problems.

      You’d get random spam that could flood your forums and the like, and setting up a CAPTCHA had an immediate, palpable effect on the spam coming in from random bot farms and shit.

      I can personally confirm that when I implemented a CAPTCHA on the forums I maintained 14 years ago, it reduced spammers by a huge degree.

      • mox@lemmy.sdf.org (OP) · 2 days ago

        There’s no point in arguing what once was. Things have changed. CAPTCHAs are now less effective, far more invasive, and for many people, far more troublesome.

        Cling to them if you like. I no longer use them on any of my sites, because I care about my users.

    • echolalia@lemmy.ml · 2 days ago

      I’m not a website administrator so I’m out of the loop. Other ways to manage bots? Like what?

      • mox@lemmy.sdf.org (OP) · 2 days ago

        What will be effective depends on the nature of the site and that of the bots causing trouble. For example, a forum can limit posting privileges until an account builds a reputation, a paid goods/services site can restrict access until a purchase is made, a web service can use revocable credentials, and a data download site can use rate limits. (That last one is actually useful in a variety of situations, and can be done at the network level instead of or in addition to the application level.)
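
        As a rough illustration of the rate-limit idea (just a sketch, not any particular framework's API; the class and parameter names are placeholders): a per-client token bucket at the application level is only a few lines.

        ```python
        # Minimal token-bucket rate limiter, sketched in plain Python.
        # Names are illustrative, not taken from a real framework.
        import time
        from collections import defaultdict


        class TokenBucket:
            """Allow bursts up to `capacity`, refilled at `rate` tokens per second."""

            def __init__(self, capacity: float = 10, rate: float = 1.0):
                self.capacity = capacity
                self.rate = rate
                self.tokens = capacity
                self.last = time.monotonic()

            def allow(self) -> bool:
                now = time.monotonic()
                # Refill based on elapsed time, capped at capacity.
                self.tokens = min(self.capacity,
                                  self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= 1:
                    self.tokens -= 1
                    return True
                return False


        # One bucket per client identifier (e.g. IP address or API key).
        buckets = defaultdict(TokenBucket)


        def handle_request(client_id: str) -> str:
            if not buckets[client_id].allow():
                return "429 Too Many Requests"
            return "200 OK"
        ```

        The same idea can sit at the network level instead; most reverse proxies ship a built-in equivalent, which keeps the load off the application entirely.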

        There is no silver bullet, but there are lots of small measures that can be very effective when applied thoughtfully, without turning a site into a frustrating-to-use surveillance tool for Google at the expense of the humans who want to or have to use it.

        Even a small, locally hosted CAPTCHA that triggers only once, using a simple image or text challenge, would be preferable to the ones operated by third parties.
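
        For what that could look like (purely a sketch; the function names and the in-memory store are assumptions, not a real library): a tiny server-side text challenge, generated and checked locally, used once at signup.

        ```python
        # Minimal sketch of a locally hosted, text-based, use-once challenge.
        # No third-party service; names and the in-memory store are illustrative.
        import secrets

        # challenge id -> expected answer (use a real session store in practice)
        _pending: dict[str, str] = {}


        def new_challenge() -> tuple[str, str]:
            """Return (challenge_id, question) to embed in a signup form."""
            a, b = secrets.randbelow(10) + 1, secrets.randbelow(10) + 1
            challenge_id = secrets.token_urlsafe(16)
            _pending[challenge_id] = str(a + b)
            return challenge_id, f"What is {a} + {b}?"


        def check_answer(challenge_id: str, answer: str) -> bool:
            """Verify and consume a challenge; each one is valid only once."""
            expected = _pending.pop(challenge_id, None)
            return expected is not None and answer.strip() == expected
        ```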

        • catloaf@lemm.ee · 2 days ago

          So all I need to do to bot your sites is farm accounts? Easy enough; people do that on Instagram at huge scale.

            • mox@lemmy.sdf.org (OP) · 2 days ago

            Good luck. You’ll find that your farmed accounts can’t do much of anything, and will be quickly and automatically deleted.