• Strawberry@lemmy.blahaj.zone · 15 days ago

    The bots scrape costly endpoints, like the full edit history of every page on a wiki. You can’t realistically cache every possible generated page at once.
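
    A rough back-of-envelope sketch of why pre-caching doesn’t scale here (the wiki size and revision counts below are made-up illustrative numbers, not from the comment): every pairwise revision diff is its own dynamically generated URL a crawler can request.

    ```python
    # Back-of-envelope: count distinct diff views a crawler could hit.
    # Hypothetical numbers: 100k pages averaging 50 revisions each.

    def diff_urls(pages: int, avg_revisions: int) -> int:
        """Distinct old-vs-new diff views across the whole wiki."""
        per_page = avg_revisions * (avg_revisions - 1) // 2  # C(R, 2) diff pairs
        return pages * per_page

    print(diff_urls(100_000, 50))  # 122500000 distinct diff pages
    ```

    Over a hundred million cacheable targets for a mid-size wiki, nearly all of them cold, which is why these endpoints end up rendered on demand.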