• Strawberry@lemmy.blahaj.zone · 18 hours ago

    The bots scrape costly endpoints, like the full edit history of every page on a wiki. You can't just cache every possible generated page at once; there are far too many of them.
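
    A rough back-of-the-envelope sketch of why that blows up, assuming a MediaWiki-style diff endpoint (the page counts and URL pattern are illustrative assumptions, not from the comment):

    ```python
    # Rough estimate of how many distinct "generated" history views a
    # MediaWiki-style wiki can serve. All numbers are illustrative.

    pages = 100_000          # articles on the wiki (assumed)
    revisions_per_page = 50  # average edit-history length (assumed)

    # Every pair of revisions has its own diff view, e.g.
    # /w/index.php?title=Foo&diff=NEW_ID&oldid=OLD_ID
    diffs_per_page = revisions_per_page * (revisions_per_page - 1) // 2

    total_diff_views = pages * diffs_per_page
    print(f"{total_diff_views:,} distinct diff pages")  # 122,500,000

    # Each view is cheap to render once, but pre-rendering and holding
    # all of them in cache simultaneously isn't realistic. A crawler
    # that walks every history link therefore hits the backend cold,
    # over and over.
    ```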