This video shows that Reddit refused to delete all of a user’s comments and posts when they closed their account via a CCPA / GDPR request.

  • TWeaK@lemm.ee · 5 points · 1 year ago

    Their explanation for the restored content will likely be something about how their CDN works.

    Granted, this excuse won’t hold up well, but it’s probably true and will limit their liability in the sense that it isn’t intentional.

    I’ve deleted my comments multiple times with PowerDeleteSuite and had things come back, a couple of times over. Now I’m going through with shreddit (the github version) using my GDPR files. It’s taking a long time because the tool panics every so many comments. I’m backing up everything as I go - I’m on file 75 so far, with 46,000 lines still left of a 75,000-line file, though it’s panicking less now that the comments are more recent - and I haven’t had it restore any of the links I’ve checked from that process.


    Reddit changed the way they display comments in the profile a few months back. Now you only see a limited number of comments under New, Hot, Top & Controversial, and these are the lists that most deletion services access. So if you use PowerDeleteSuite or any other service, it will likely miss things. In particular, I opened up links to my older Top comments, ran the script, then found it had completely ignored replies underneath my comment that had low but positive karma - those wouldn’t have appeared in any of the lists. My New list only went back about 3 months (although I think the cutoff is a number of comments rather than a length of time).
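
    You can check the gap for yourself against the public listing JSON. A rough Python sketch (USERNAME is a placeholder, the 20-page cap is arbitrary, and the endpoint may well be rate-limited or blocked these days):

    import requests

    USERNAME = "yourname"  # placeholder - your reddit account
    seen = set()
    after = None
    for _ in range(20):  # up to 20 pages of 100 comments
        params = {"limit": 100, "sort": "new"}
        if after:
            params["after"] = after
        resp = requests.get(
            f"https://www.reddit.com/user/{USERNAME}/comments.json",
            params=params,
            headers={"User-Agent": "listing-audit/0.1"},
            timeout=30,
        )
        data = resp.json()["data"]
        seen.update(child["data"]["id"] for child in data["children"])
        after = data["after"]
        if not after:
            break

    print(f"listing exposes {len(seen)} comments - compare with your GDPR comments.csv")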

    You really need to use the GDPR files to get everything. These contain CSV files with links to every single post and comment you’ve made. However, it seems that reddit are delaying following through with most requests until after 1 July, when API requests (such as those that shreddit uses) will be blocked.
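
    The CSVs are easy to work with. A minimal Python sketch that counts the comments in an export and lists their links (the “permalink” column name is an assumption from my own files - check the header row of your comments.csv):

    import csv

    with open("comments.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    print(f"{len(rows)} comments in the export")
    for row in rows:
        print(row["permalink"])  # may be a full URL or a site-relative path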


    Also, PSA: don’t use the shreddit website - they charge you $15. The github version is free and will take the CSV files with the appropriate tag. But, again, in my experience it panics and hangs fairly often, so it takes a lot of work to use. I’ve had to run it, back up the terminal output, take the last link it processed, delete everything in posts.csv and comments.csv before the one it got stuck on, then resume with the amended files.
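
    That trimming step is easy to script. A sketch in Python, again assuming a “permalink” column, which writes a trimmed copy rather than clobbering the original:

    import csv

    LAST_DONE = "<last permalink from the terminal output>"  # placeholder

    with open("comments.csv", newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        rows = list(reader)
        fields = reader.fieldnames

    # Raises StopIteration if the permalink isn't found - paste it exactly.
    idx = next(i for i, row in enumerate(rows) if row["permalink"] == LAST_DONE)

    with open("comments.resume.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(rows[idx + 1:])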

    Reddit really isn’t making it easy to follow through with your rights. Make records of this - they can then be used to convince local Data Protection Authorities to collectively throw down a bigger hammer than Huffman ever wielded, or even imagined.

    Another PSA: reddit’s terms do not deny you ownership of your content. So even if they claim ownership themselves (as Steve Huffman has frequently stated publicly), they cannot deny you the right to edit your content and restrict what they do with it. It’s your information, and reddit hasn’t even paid for it.

    You can’t sell a microwave without paying for the nuts and bolts.

    • May@kbin.social · 1 point · 1 year ago

      it seems that reddit are delaying following through with most requests until after 1 July when API requests (such as those that shreddit uses) will be blocked.

      I was so worried about this, and thinking that something like that would be done, back when I saw someone warn in the save-3rd-party-apps sub that you should request your data. Still, I tried making a request, because I thought maybe reddit hadn’t caught on yet, or maybe because it was before the blackout there was still a chance - but to this day I never got the data. :(

      Probably I’ll just leave the comments and posts. I didn’t post a lot.

        • DrNeurohax@kbin.social · 2 points · 1 year ago

          This did not get the traction it should have. It’s probably the best of the dozen-ish methods I’ve seen.

        • TWeaK@lemm.ee · 2 points · 1 year ago

          TL;DR: get the 1.6GB Pushshift torrent, then edit a script to extract your data, then edit another script to use that data to overwrite your comments.
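
          For the extraction step, a minimal Python sketch, assuming the dump is zstd-compressed NDJSON like the Pushshift archives (needs the zstandard package; the filename and username are placeholders):

          import io
          import json
          import zstandard

          AUTHOR = "yourname"  # placeholder

          with open("comments_dump.zst", "rb") as raw, \
               open("mine.ndjson", "w", encoding="utf-8") as out:
              # Pushshift dumps use a large zstd window, hence max_window_size.
              reader = zstandard.ZstdDecompressor(max_window_size=2**31).stream_reader(raw)
              for line in io.TextIOWrapper(reader, encoding="utf-8", errors="replace"):
                  if AUTHOR not in line:
                      continue  # cheap substring pre-filter before json.loads
                  if json.loads(line).get("author") == AUTHOR:
                      out.write(line)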

      • TWeaK@lemm.ee · 1 point · 1 year ago

        You can still use the GDPR files to get at all your comments; you just won’t be able to use existing API methods to automate it. However, perhaps it would be possible to use the links to automate things via a scraping method or something - maybe the PowerDeleteSuite approach could be expanded upon.
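
        Something like this Python + Selenium sketch is the kind of thing I mean: drive a logged-in browser to each permalink from the GDPR CSV and click old.reddit.com’s delete controls. The CSS selectors here are guesses and would need checking against the real markup before trusting it:

        import csv
        import time

        from selenium import webdriver
        from selenium.webdriver.common.by import By

        driver = webdriver.Firefox()
        driver.get("https://old.reddit.com/login")
        input("Log in in the browser window, then press Enter here...")

        with open("comments.csv", newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):  # "permalink" column is an assumption
                # If your CSV stores full URLs, drop the prefix below.
                driver.get("https://old.reddit.com" + row["permalink"])
                # Hypothetical selectors - inspect the real page and adjust.
                driver.find_element(By.CSS_SELECTOR, "form.del-button a").click()
                driver.find_element(By.CSS_SELECTOR, "form.del-button a.yes").click()
                time.sleep(5)  # stay under the rate limit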

        • abff08f4813c@kbin.social · 1 point · edited · 1 year ago

          Yeah, you’re right. It’d be tough to directly modify PDS, as that’s JavaScript running in a browser, and there are strict restrictions on what JS can do with the filesystem in that case.

          But maybe someone could create a browser extension that does the same job. Extensions have fewer restrictions, so it could perhaps be driven by a file.

          Or maybe someone will come up with some kind of shell script that can read the archive and copy & paste the URLs for each of your posts and comments, one by one, into the JavaScript console of your browser, letting PDS take care of the rest (visiting each one and simulating clicks on the edit and delete buttons).
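
          In that spirit, even a small Python loop could do the feeding via the clipboard instead of a shell script (pyperclip is just one option - xclip or pbcopy would do the same job; the “permalink” column is again an assumption):

          import csv

          import pyperclip  # assumption: any clipboard helper would do

          with open("comments.csv", newline="", encoding="utf-8") as f:
              for row in csv.DictReader(f):  # "permalink" column is an assumption
                  pyperclip.copy(row["permalink"])
                  input(f"Copied {row['permalink']} - press Enter for the next one...")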

          The other issue is that PDS currently depends on old.reddit.com, from what I understand. If that ever gets dropped, PDS will break until it’s updated to work with new reddit.

      • DrNeurohax@kbin.social · 1 point · 1 year ago

        There’s also a semi-automatic deletion userscript that doesn’t use the API, called so-long-reddit-thanks-for-all-the-fish.

        You go to your comments page, click a button, and it performs the actions within the browser. Without any further interaction, you’ll see the screen scroll to the bottom, click edit on the last comment, enter the text set in the script (the default is a link to the script, but you can change it to anything), click save, and move on to the next comment (pretty sure it can delete, too). For best results, use a never-ending-Reddit script and keep scrolling until no more pages load. Also, re-sort the comments by each option (Top, Newest, etc.) to check for stragglers.

        You can still use your browser in the meantime, though I recommend keeping the task in its own window (in case your browser or an addon unloads pages you haven’t accessed in x minutes). If you do something that makes the browser lag a little, it can cause the script to miss a comment, so you might need to run it twice. I used this on one account and it worked nearly flawlessly for several thousand comments, skipping only ~10 or so.

    • Bill Stickers · 1 point · 1 year ago

      Granted, this excuse won’t hold up well, but it’s probably true and will limit their liability in the sense that it isn’t intentional.

      GDPR has been in place for years now. They’ll be fully liable for not setting up a system to comply.

      • TWeaK@lemm.ee · 0 points · 1 year ago

        They’ll be fully liable, but the fine will likely be less than if it were explicitly intentional.

    • DrNeurohax@kbin.social · 1 point · 1 year ago

      If you’re using the main repo for PDS then you probably have the version that doesn’t pause for 5 secs between API calls (Reddit’s limit). The first fork has the pause and works correctly, though slowly. Just be aware that there’s a bug in PDS that stops it adding to the exported file once it hits an error (if you have 100 comments and get an error on comment #15, it will continue to edit/delete, but the exported file will only have 14 comments).
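
      A loop that avoids both problems is simple enough to sketch in Python: pause 5 seconds between calls, and write each comment to the backup before touching it, flushing as it goes. delete_comment() here is a stand-in for whatever call the tool actually makes:

      import json
      import time

      def delete_comment(comment):
          """Placeholder - swap in whatever API or scraper call you actually use."""
          raise NotImplementedError

      def shred(comments, backup_path="backup.ndjson"):
          with open(backup_path, "a", encoding="utf-8") as backup:
              for comment in comments:
                  # Back up first and flush immediately, so an error on the
                  # delete can never cost you the exported copy.
                  backup.write(json.dumps(comment) + "\n")
                  backup.flush()
                  try:
                      delete_comment(comment)
                  except Exception as err:
                      print(f"error on {comment.get('id')}: {err} - continuing")
                  time.sleep(5)  # reddit's limit: pause between API calls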